Dec 03 20:34:22 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 03 20:34:22 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 03 20:34:22 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 03 20:34:22 localhost kernel: BIOS-provided physical RAM map:
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 03 20:34:22 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 03 20:34:22 localhost kernel: NX (Execute Disable) protection: active
Dec 03 20:34:22 localhost kernel: APIC: Static calls initialized
Dec 03 20:34:22 localhost kernel: SMBIOS 2.8 present.
Dec 03 20:34:22 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 03 20:34:22 localhost kernel: Hypervisor detected: KVM
Dec 03 20:34:22 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 03 20:34:22 localhost kernel: kvm-clock: using sched offset of 3274772429 cycles
Dec 03 20:34:22 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 03 20:34:22 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 03 20:34:22 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 03 20:34:22 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 03 20:34:22 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 03 20:34:22 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 03 20:34:22 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 03 20:34:22 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 03 20:34:22 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 03 20:34:22 localhost kernel: Using GB pages for direct mapping
Dec 03 20:34:22 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 03 20:34:22 localhost kernel: ACPI: Early table checksum verification disabled
Dec 03 20:34:22 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 03 20:34:22 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 03 20:34:22 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 03 20:34:22 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 03 20:34:22 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 03 20:34:22 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 03 20:34:22 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 03 20:34:22 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 03 20:34:22 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 03 20:34:22 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 03 20:34:22 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 03 20:34:22 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 03 20:34:22 localhost kernel: No NUMA configuration found
Dec 03 20:34:22 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 03 20:34:22 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec 03 20:34:22 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 03 20:34:22 localhost kernel: Zone ranges:
Dec 03 20:34:22 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 03 20:34:22 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 03 20:34:22 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 03 20:34:22 localhost kernel:   Device   empty
Dec 03 20:34:22 localhost kernel: Movable zone start for each node
Dec 03 20:34:22 localhost kernel: Early memory node ranges
Dec 03 20:34:22 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 03 20:34:22 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 03 20:34:22 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 03 20:34:22 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 03 20:34:22 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 03 20:34:22 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 03 20:34:22 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 03 20:34:22 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 03 20:34:22 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 03 20:34:22 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 03 20:34:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 03 20:34:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 03 20:34:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 03 20:34:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 03 20:34:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 03 20:34:22 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 03 20:34:22 localhost kernel: TSC deadline timer available
Dec 03 20:34:22 localhost kernel: CPU topo: Max. logical packages:   8
Dec 03 20:34:22 localhost kernel: CPU topo: Max. logical dies:       8
Dec 03 20:34:22 localhost kernel: CPU topo: Max. dies per package:   1
Dec 03 20:34:22 localhost kernel: CPU topo: Max. threads per core:   1
Dec 03 20:34:22 localhost kernel: CPU topo: Num. cores per package:     1
Dec 03 20:34:22 localhost kernel: CPU topo: Num. threads per package:   1
Dec 03 20:34:22 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 03 20:34:22 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 03 20:34:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 03 20:34:22 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 03 20:34:22 localhost kernel: Booting paravirtualized kernel on KVM
Dec 03 20:34:22 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 03 20:34:22 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 03 20:34:22 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 03 20:34:22 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 03 20:34:22 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 03 20:34:22 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 03 20:34:22 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 03 20:34:22 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 03 20:34:22 localhost kernel: random: crng init done
Dec 03 20:34:22 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 03 20:34:22 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 03 20:34:22 localhost kernel: Fallback order for Node 0: 0 
Dec 03 20:34:22 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 03 20:34:22 localhost kernel: Policy zone: Normal
Dec 03 20:34:22 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 03 20:34:22 localhost kernel: software IO TLB: area num 8.
Dec 03 20:34:22 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 03 20:34:22 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 03 20:34:22 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 03 20:34:22 localhost kernel: Dynamic Preempt: voluntary
Dec 03 20:34:22 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 03 20:34:22 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 03 20:34:22 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 03 20:34:22 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 03 20:34:22 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 03 20:34:22 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 03 20:34:22 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 03 20:34:22 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 03 20:34:22 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 03 20:34:22 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 03 20:34:22 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 03 20:34:22 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 03 20:34:22 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 03 20:34:22 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 03 20:34:22 localhost kernel: Console: colour VGA+ 80x25
Dec 03 20:34:22 localhost kernel: printk: console [ttyS0] enabled
Dec 03 20:34:22 localhost kernel: ACPI: Core revision 20230331
Dec 03 20:34:22 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 03 20:34:22 localhost kernel: x2apic enabled
Dec 03 20:34:22 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 03 20:34:22 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 03 20:34:22 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 03 20:34:22 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 03 20:34:22 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 03 20:34:22 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 03 20:34:22 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 03 20:34:22 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 03 20:34:22 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 03 20:34:22 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 03 20:34:22 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 03 20:34:22 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 03 20:34:22 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 03 20:34:22 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 03 20:34:22 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 03 20:34:22 localhost kernel: x86/bugs: return thunk changed
Dec 03 20:34:22 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 03 20:34:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 03 20:34:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 03 20:34:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 03 20:34:22 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 03 20:34:22 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 03 20:34:22 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 03 20:34:22 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 03 20:34:22 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 03 20:34:22 localhost kernel: landlock: Up and running.
Dec 03 20:34:22 localhost kernel: Yama: becoming mindful.
Dec 03 20:34:22 localhost kernel: SELinux:  Initializing.
Dec 03 20:34:22 localhost kernel: LSM support for eBPF active
Dec 03 20:34:22 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 03 20:34:22 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 03 20:34:22 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 03 20:34:22 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 03 20:34:22 localhost kernel: ... version:                0
Dec 03 20:34:22 localhost kernel: ... bit width:              48
Dec 03 20:34:22 localhost kernel: ... generic registers:      6
Dec 03 20:34:22 localhost kernel: ... value mask:             0000ffffffffffff
Dec 03 20:34:22 localhost kernel: ... max period:             00007fffffffffff
Dec 03 20:34:22 localhost kernel: ... fixed-purpose events:   0
Dec 03 20:34:22 localhost kernel: ... event mask:             000000000000003f
Dec 03 20:34:22 localhost kernel: signal: max sigframe size: 1776
Dec 03 20:34:22 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 03 20:34:22 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 03 20:34:22 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 03 20:34:22 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 03 20:34:22 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 03 20:34:22 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 03 20:34:22 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 03 20:34:22 localhost kernel: node 0 deferred pages initialised in 9ms
Dec 03 20:34:22 localhost kernel: Memory: 7763932K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618208K reserved, 0K cma-reserved)
Dec 03 20:34:22 localhost kernel: devtmpfs: initialized
Dec 03 20:34:22 localhost kernel: x86/mm: Memory block size: 128MB
Dec 03 20:34:22 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 03 20:34:22 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 03 20:34:22 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 03 20:34:22 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 03 20:34:22 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 03 20:34:22 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 03 20:34:22 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 03 20:34:22 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 03 20:34:22 localhost kernel: audit: type=2000 audit(1764794060.641:1): state=initialized audit_enabled=0 res=1
Dec 03 20:34:22 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 03 20:34:22 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 03 20:34:22 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 03 20:34:22 localhost kernel: cpuidle: using governor menu
Dec 03 20:34:22 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 03 20:34:22 localhost kernel: PCI: Using configuration type 1 for base access
Dec 03 20:34:22 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 03 20:34:22 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 03 20:34:22 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 03 20:34:22 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 03 20:34:22 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 03 20:34:22 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 03 20:34:22 localhost kernel: Demotion targets for Node 0: null
Dec 03 20:34:22 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 03 20:34:22 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 03 20:34:22 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 03 20:34:22 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 03 20:34:22 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 03 20:34:22 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 03 20:34:22 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 03 20:34:22 localhost kernel: ACPI: Interpreter enabled
Dec 03 20:34:22 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 03 20:34:22 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 03 20:34:22 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 03 20:34:22 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 03 20:34:22 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 03 20:34:22 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 03 20:34:22 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [3] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [4] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [5] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [6] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [7] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [8] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [9] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [10] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [11] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [12] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [13] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [14] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [15] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [16] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [17] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [18] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [19] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [20] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [21] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [22] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [23] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [24] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [25] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [26] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [27] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [28] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [29] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [30] registered
Dec 03 20:34:22 localhost kernel: acpiphp: Slot [31] registered
Dec 03 20:34:22 localhost kernel: PCI host bridge to bus 0000:00
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 03 20:34:22 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 03 20:34:22 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 03 20:34:22 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 03 20:34:22 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 03 20:34:22 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 03 20:34:22 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 03 20:34:22 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 03 20:34:22 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 03 20:34:22 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 03 20:34:22 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 03 20:34:22 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 03 20:34:22 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 03 20:34:22 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 03 20:34:22 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 03 20:34:22 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 03 20:34:22 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 03 20:34:22 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 03 20:34:22 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 03 20:34:22 localhost kernel: iommu: Default domain type: Translated
Dec 03 20:34:22 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 03 20:34:22 localhost kernel: SCSI subsystem initialized
Dec 03 20:34:22 localhost kernel: ACPI: bus type USB registered
Dec 03 20:34:22 localhost kernel: usbcore: registered new interface driver usbfs
Dec 03 20:34:22 localhost kernel: usbcore: registered new interface driver hub
Dec 03 20:34:22 localhost kernel: usbcore: registered new device driver usb
Dec 03 20:34:22 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 03 20:34:22 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 03 20:34:22 localhost kernel: PTP clock support registered
Dec 03 20:34:22 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 03 20:34:22 localhost kernel: NetLabel: Initializing
Dec 03 20:34:22 localhost kernel: NetLabel:  domain hash size = 128
Dec 03 20:34:22 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 03 20:34:22 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 03 20:34:22 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 03 20:34:22 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 03 20:34:22 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 03 20:34:22 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 03 20:34:22 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 03 20:34:22 localhost kernel: vgaarb: loaded
Dec 03 20:34:22 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 03 20:34:22 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 03 20:34:22 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 03 20:34:22 localhost kernel: pnp: PnP ACPI init
Dec 03 20:34:22 localhost kernel: pnp 00:03: [dma 2]
Dec 03 20:34:22 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 03 20:34:22 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 03 20:34:22 localhost kernel: NET: Registered PF_INET protocol family
Dec 03 20:34:22 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 03 20:34:22 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 03 20:34:22 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 03 20:34:22 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 03 20:34:22 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 03 20:34:22 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 03 20:34:22 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 03 20:34:22 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 03 20:34:22 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 03 20:34:22 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 03 20:34:22 localhost kernel: NET: Registered PF_XDP protocol family
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 03 20:34:22 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 03 20:34:22 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 03 20:34:22 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 03 20:34:22 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 71614 usecs
Dec 03 20:34:22 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 03 20:34:22 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 03 20:34:22 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 03 20:34:22 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 03 20:34:22 localhost kernel: ACPI: bus type thunderbolt registered
Dec 03 20:34:22 localhost kernel: Initialise system trusted keyrings
Dec 03 20:34:22 localhost kernel: Key type blacklist registered
Dec 03 20:34:22 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 03 20:34:22 localhost kernel: zbud: loaded
Dec 03 20:34:22 localhost kernel: integrity: Platform Keyring initialized
Dec 03 20:34:22 localhost kernel: integrity: Machine keyring initialized
Dec 03 20:34:22 localhost kernel: Freeing initrd memory: 87804K
Dec 03 20:34:22 localhost kernel: NET: Registered PF_ALG protocol family
Dec 03 20:34:22 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 03 20:34:22 localhost kernel: Key type asymmetric registered
Dec 03 20:34:22 localhost kernel: Asymmetric key parser 'x509' registered
Dec 03 20:34:22 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 03 20:34:22 localhost kernel: io scheduler mq-deadline registered
Dec 03 20:34:22 localhost kernel: io scheduler kyber registered
Dec 03 20:34:22 localhost kernel: io scheduler bfq registered
Dec 03 20:34:22 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 03 20:34:22 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 03 20:34:22 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 03 20:34:22 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 03 20:34:22 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 03 20:34:22 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 03 20:34:22 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 03 20:34:22 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 03 20:34:22 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 03 20:34:22 localhost kernel: Non-volatile memory driver v1.3
Dec 03 20:34:22 localhost kernel: rdac: device handler registered
Dec 03 20:34:22 localhost kernel: hp_sw: device handler registered
Dec 03 20:34:22 localhost kernel: emc: device handler registered
Dec 03 20:34:22 localhost kernel: alua: device handler registered
Dec 03 20:34:22 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 03 20:34:22 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 03 20:34:22 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 03 20:34:22 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 03 20:34:22 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 03 20:34:22 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 03 20:34:22 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 03 20:34:22 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 03 20:34:22 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 03 20:34:22 localhost kernel: hub 1-0:1.0: USB hub found
Dec 03 20:34:22 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 03 20:34:22 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 03 20:34:22 localhost kernel: usbserial: USB Serial support registered for generic
Dec 03 20:34:22 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 03 20:34:22 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 03 20:34:22 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 03 20:34:22 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 03 20:34:22 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 03 20:34:22 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 03 20:34:22 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 03 20:34:22 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-03T20:34:21 UTC (1764794061)
Dec 03 20:34:22 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 03 20:34:22 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 03 20:34:22 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 03 20:34:22 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 03 20:34:22 localhost kernel: usbcore: registered new interface driver usbhid
Dec 03 20:34:22 localhost kernel: usbhid: USB HID core driver
Dec 03 20:34:22 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 03 20:34:22 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 03 20:34:22 localhost kernel: Initializing XFRM netlink socket
Dec 03 20:34:22 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 03 20:34:22 localhost kernel: Segment Routing with IPv6
Dec 03 20:34:22 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 03 20:34:22 localhost kernel: mpls_gso: MPLS GSO support
Dec 03 20:34:22 localhost kernel: IPI shorthand broadcast: enabled
Dec 03 20:34:22 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 03 20:34:22 localhost kernel: AES CTR mode by8 optimization enabled
Dec 03 20:34:22 localhost kernel: sched_clock: Marking stable (1169006450, 151021234)->(1458226480, -138198796)
Dec 03 20:34:22 localhost kernel: registered taskstats version 1
Dec 03 20:34:22 localhost kernel: Loading compiled-in X.509 certificates
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 03 20:34:22 localhost kernel: Demotion targets for Node 0: null
Dec 03 20:34:22 localhost kernel: page_owner is disabled
Dec 03 20:34:22 localhost kernel: Key type .fscrypt registered
Dec 03 20:34:22 localhost kernel: Key type fscrypt-provisioning registered
Dec 03 20:34:22 localhost kernel: Key type big_key registered
Dec 03 20:34:22 localhost kernel: Key type encrypted registered
Dec 03 20:34:22 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 03 20:34:22 localhost kernel: Loading compiled-in module X.509 certificates
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 03 20:34:22 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 03 20:34:22 localhost kernel: ima: No architecture policies found
Dec 03 20:34:22 localhost kernel: evm: Initialising EVM extended attributes:
Dec 03 20:34:22 localhost kernel: evm: security.selinux
Dec 03 20:34:22 localhost kernel: evm: security.SMACK64 (disabled)
Dec 03 20:34:22 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 03 20:34:22 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 03 20:34:22 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 03 20:34:22 localhost kernel: evm: security.apparmor (disabled)
Dec 03 20:34:22 localhost kernel: evm: security.ima
Dec 03 20:34:22 localhost kernel: evm: security.capability
Dec 03 20:34:22 localhost kernel: evm: HMAC attrs: 0x1
Dec 03 20:34:22 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 03 20:34:22 localhost kernel: Running certificate verification RSA selftest
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 03 20:34:22 localhost kernel: Running certificate verification ECDSA selftest
Dec 03 20:34:22 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 03 20:34:22 localhost kernel: clk: Disabling unused clocks
Dec 03 20:34:22 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 03 20:34:22 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 03 20:34:22 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 03 20:34:22 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 03 20:34:22 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 03 20:34:22 localhost kernel: Run /init as init process
Dec 03 20:34:22 localhost kernel:   with arguments:
Dec 03 20:34:22 localhost kernel:     /init
Dec 03 20:34:22 localhost kernel:   with environment:
Dec 03 20:34:22 localhost kernel:     HOME=/
Dec 03 20:34:22 localhost kernel:     TERM=linux
Dec 03 20:34:22 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 03 20:34:22 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 03 20:34:22 localhost systemd[1]: Detected virtualization kvm.
Dec 03 20:34:22 localhost systemd[1]: Detected architecture x86-64.
Dec 03 20:34:22 localhost systemd[1]: Running in initrd.
Dec 03 20:34:22 localhost systemd[1]: No hostname configured, using default hostname.
Dec 03 20:34:22 localhost systemd[1]: Hostname set to <localhost>.
Dec 03 20:34:22 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 03 20:34:22 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 03 20:34:22 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 03 20:34:22 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 03 20:34:22 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 03 20:34:22 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 03 20:34:22 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 03 20:34:22 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 03 20:34:22 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 03 20:34:22 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 03 20:34:22 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 03 20:34:22 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 03 20:34:22 localhost systemd[1]: Reached target Local File Systems.
Dec 03 20:34:22 localhost systemd[1]: Reached target Path Units.
Dec 03 20:34:22 localhost systemd[1]: Reached target Slice Units.
Dec 03 20:34:22 localhost systemd[1]: Reached target Swaps.
Dec 03 20:34:22 localhost systemd[1]: Reached target Timer Units.
Dec 03 20:34:22 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 03 20:34:22 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 03 20:34:22 localhost systemd[1]: Listening on Journal Socket.
Dec 03 20:34:22 localhost systemd[1]: Listening on udev Control Socket.
Dec 03 20:34:22 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 03 20:34:22 localhost systemd[1]: Reached target Socket Units.
Dec 03 20:34:22 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 03 20:34:22 localhost systemd[1]: Starting Journal Service...
Dec 03 20:34:22 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 03 20:34:22 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 03 20:34:22 localhost systemd[1]: Starting Create System Users...
Dec 03 20:34:22 localhost systemd[1]: Starting Setup Virtual Console...
Dec 03 20:34:22 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 03 20:34:22 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 03 20:34:22 localhost systemd[1]: Finished Create System Users.
Dec 03 20:34:22 localhost systemd-journald[307]: Journal started
Dec 03 20:34:22 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/fe8087480a274a3c9875a9777da5fa17) is 8.0M, max 153.6M, 145.6M free.
Dec 03 20:34:22 localhost systemd-sysusers[311]: Creating group 'users' with GID 100.
Dec 03 20:34:22 localhost systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Dec 03 20:34:22 localhost systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 03 20:34:22 localhost systemd[1]: Started Journal Service.
Dec 03 20:34:22 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 03 20:34:22 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 03 20:34:22 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 03 20:34:22 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 03 20:34:22 localhost systemd[1]: Finished Setup Virtual Console.
Dec 03 20:34:22 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 03 20:34:22 localhost systemd[1]: Starting dracut cmdline hook...
Dec 03 20:34:22 localhost dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Dec 03 20:34:22 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 03 20:34:22 localhost systemd[1]: Finished dracut cmdline hook.
Dec 03 20:34:22 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 03 20:34:22 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 03 20:34:22 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 03 20:34:22 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 03 20:34:22 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 03 20:34:22 localhost kernel: RPC: Registered udp transport module.
Dec 03 20:34:22 localhost kernel: RPC: Registered tcp transport module.
Dec 03 20:34:22 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 03 20:34:22 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 03 20:34:22 localhost rpc.statd[443]: Version 2.5.4 starting
Dec 03 20:34:22 localhost rpc.statd[443]: Initializing NSM state
Dec 03 20:34:22 localhost rpc.idmapd[448]: Setting log level to 0
Dec 03 20:34:22 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 03 20:34:22 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 03 20:34:22 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Dec 03 20:34:22 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 03 20:34:22 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 03 20:34:22 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 03 20:34:22 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 03 20:34:22 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 03 20:34:22 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 03 20:34:22 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 03 20:34:22 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 03 20:34:22 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 03 20:34:22 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 03 20:34:22 localhost systemd[1]: Reached target Network.
Dec 03 20:34:22 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 03 20:34:22 localhost systemd[1]: Starting dracut initqueue hook...
Dec 03 20:34:22 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 03 20:34:22 localhost kernel: libata version 3.00 loaded.
Dec 03 20:34:22 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 03 20:34:22 localhost kernel:  vda: vda1
Dec 03 20:34:22 localhost systemd-udevd[492]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 20:34:22 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 03 20:34:22 localhost kernel: scsi host0: ata_piix
Dec 03 20:34:22 localhost kernel: scsi host1: ata_piix
Dec 03 20:34:22 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 03 20:34:22 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 03 20:34:22 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 03 20:34:23 localhost systemd[1]: Reached target Initrd Root Device.
Dec 03 20:34:23 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 03 20:34:23 localhost kernel: ata1: found unknown device (class 0)
Dec 03 20:34:23 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 03 20:34:23 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 03 20:34:23 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 03 20:34:23 localhost systemd[1]: Reached target System Initialization.
Dec 03 20:34:23 localhost systemd[1]: Reached target Basic System.
Dec 03 20:34:23 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 03 20:34:23 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 03 20:34:23 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 03 20:34:23 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 03 20:34:23 localhost systemd[1]: Finished dracut initqueue hook.
Dec 03 20:34:23 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 03 20:34:23 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 03 20:34:23 localhost systemd[1]: Reached target Remote File Systems.
Dec 03 20:34:23 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 03 20:34:23 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 03 20:34:23 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 03 20:34:23 localhost systemd-fsck[558]: /usr/sbin/fsck.xfs: XFS file system.
Dec 03 20:34:23 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 03 20:34:23 localhost systemd[1]: Mounting /sysroot...
Dec 03 20:34:23 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 03 20:34:23 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 03 20:34:23 localhost kernel: XFS (vda1): Ending clean mount
Dec 03 20:34:23 localhost systemd[1]: Mounted /sysroot.
Dec 03 20:34:23 localhost systemd[1]: Reached target Initrd Root File System.
Dec 03 20:34:23 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 03 20:34:23 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 03 20:34:23 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 03 20:34:23 localhost systemd[1]: Reached target Initrd File Systems.
Dec 03 20:34:23 localhost systemd[1]: Reached target Initrd Default Target.
Dec 03 20:34:23 localhost systemd[1]: Starting dracut mount hook...
Dec 03 20:34:23 localhost systemd[1]: Finished dracut mount hook.
Dec 03 20:34:23 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 03 20:34:24 localhost rpc.idmapd[448]: exiting on signal 15
Dec 03 20:34:24 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 03 20:34:24 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 03 20:34:24 localhost systemd[1]: Stopped target Network.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Timer Units.
Dec 03 20:34:24 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 03 20:34:24 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Basic System.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Path Units.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Remote File Systems.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Slice Units.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Socket Units.
Dec 03 20:34:24 localhost systemd[1]: Stopped target System Initialization.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Local File Systems.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Swaps.
Dec 03 20:34:24 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped dracut mount hook.
Dec 03 20:34:24 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 03 20:34:24 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 03 20:34:24 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 03 20:34:24 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 03 20:34:24 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 03 20:34:24 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 03 20:34:24 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 03 20:34:24 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 03 20:34:24 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 03 20:34:24 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 03 20:34:24 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 03 20:34:24 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Closed udev Control Socket.
Dec 03 20:34:24 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Closed udev Kernel Socket.
Dec 03 20:34:24 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 03 20:34:24 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 03 20:34:24 localhost systemd[1]: Starting Cleanup udev Database...
Dec 03 20:34:24 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 03 20:34:24 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 03 20:34:24 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Create System Users.
Dec 03 20:34:24 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Finished Cleanup udev Database.
Dec 03 20:34:24 localhost systemd[1]: Reached target Switch Root.
Dec 03 20:34:24 localhost systemd[1]: Starting Switch Root...
Dec 03 20:34:24 localhost systemd[1]: Switching root.
Dec 03 20:34:24 localhost systemd-journald[307]: Journal stopped
Dec 03 20:34:24 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Dec 03 20:34:24 localhost kernel: audit: type=1404 audit(1764794064.309:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 03 20:34:24 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 20:34:24 localhost kernel: SELinux:  policy capability open_perms=1
Dec 03 20:34:24 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 20:34:24 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 03 20:34:24 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 20:34:24 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 20:34:24 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 20:34:24 localhost kernel: audit: type=1403 audit(1764794064.443:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 03 20:34:24 localhost systemd[1]: Successfully loaded SELinux policy in 138.089ms.
Dec 03 20:34:24 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.006ms.
Dec 03 20:34:24 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 03 20:34:24 localhost systemd[1]: Detected virtualization kvm.
Dec 03 20:34:24 localhost systemd[1]: Detected architecture x86-64.
Dec 03 20:34:24 localhost systemd-rc-local-generator[643]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 20:34:24 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped Switch Root.
Dec 03 20:34:24 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 03 20:34:24 localhost systemd[1]: Created slice Slice /system/getty.
Dec 03 20:34:24 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 03 20:34:24 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 03 20:34:24 localhost systemd[1]: Created slice User and Session Slice.
Dec 03 20:34:24 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 03 20:34:24 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 03 20:34:24 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 03 20:34:24 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Switch Root.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 03 20:34:24 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 03 20:34:24 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 03 20:34:24 localhost systemd[1]: Reached target Path Units.
Dec 03 20:34:24 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 03 20:34:24 localhost systemd[1]: Reached target Slice Units.
Dec 03 20:34:24 localhost systemd[1]: Reached target Swaps.
Dec 03 20:34:24 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 03 20:34:24 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 03 20:34:24 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 03 20:34:24 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 03 20:34:24 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 03 20:34:24 localhost systemd[1]: Listening on udev Control Socket.
Dec 03 20:34:24 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 03 20:34:24 localhost systemd[1]: Mounting Huge Pages File System...
Dec 03 20:34:24 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 03 20:34:24 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 03 20:34:24 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 03 20:34:24 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 03 20:34:24 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 03 20:34:24 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 03 20:34:24 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 03 20:34:24 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 03 20:34:24 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 03 20:34:24 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 03 20:34:24 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 03 20:34:24 localhost systemd[1]: Stopped Journal Service.
Dec 03 20:34:24 localhost kernel: fuse: init (API version 7.37)
Dec 03 20:34:24 localhost systemd[1]: Starting Journal Service...
Dec 03 20:34:24 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 03 20:34:24 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 03 20:34:24 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 03 20:34:24 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 03 20:34:24 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 03 20:34:24 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 03 20:34:24 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 03 20:34:24 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 03 20:34:24 localhost systemd-journald[684]: Journal started
Dec 03 20:34:24 localhost systemd-journald[684]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 03 20:34:24 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 03 20:34:24 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Started Journal Service.
Dec 03 20:34:24 localhost systemd[1]: Mounted Huge Pages File System.
Dec 03 20:34:24 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 03 20:34:24 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 03 20:34:24 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 03 20:34:24 localhost kernel: ACPI: bus type drm_connector registered
Dec 03 20:34:24 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 03 20:34:24 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 03 20:34:24 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 03 20:34:24 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 03 20:34:24 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 03 20:34:24 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 03 20:34:24 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 03 20:34:24 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 03 20:34:24 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 03 20:34:24 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 03 20:34:25 localhost systemd[1]: Mounting FUSE Control File System...
Dec 03 20:34:25 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 03 20:34:25 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 03 20:34:25 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 03 20:34:25 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 03 20:34:25 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 03 20:34:25 localhost systemd[1]: Starting Create System Users...
Dec 03 20:34:25 localhost systemd[1]: Mounted FUSE Control File System.
Dec 03 20:34:25 localhost systemd-journald[684]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 03 20:34:25 localhost systemd-journald[684]: Received client request to flush runtime journal.
Dec 03 20:34:25 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 03 20:34:25 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 03 20:34:25 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 03 20:34:25 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 03 20:34:25 localhost systemd[1]: Finished Create System Users.
Dec 03 20:34:25 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 03 20:34:25 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 03 20:34:25 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 03 20:34:25 localhost systemd[1]: Reached target Local File Systems.
Dec 03 20:34:25 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 03 20:34:25 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 03 20:34:25 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 03 20:34:25 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 03 20:34:25 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 03 20:34:25 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 03 20:34:25 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 03 20:34:25 localhost bootctl[700]: Couldn't find EFI system partition, skipping.
Dec 03 20:34:25 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 03 20:34:25 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 03 20:34:25 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 03 20:34:25 localhost systemd[1]: Starting Security Auditing Service...
Dec 03 20:34:25 localhost systemd[1]: Starting RPC Bind...
Dec 03 20:34:25 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 03 20:34:25 localhost auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 03 20:34:25 localhost auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 03 20:34:25 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 03 20:34:25 localhost systemd[1]: Started RPC Bind.
Dec 03 20:34:25 localhost augenrules[711]: /sbin/augenrules: No change
Dec 03 20:34:25 localhost augenrules[726]: No rules
Dec 03 20:34:25 localhost augenrules[726]: enabled 1
Dec 03 20:34:25 localhost augenrules[726]: failure 1
Dec 03 20:34:25 localhost augenrules[726]: pid 706
Dec 03 20:34:25 localhost augenrules[726]: rate_limit 0
Dec 03 20:34:25 localhost augenrules[726]: backlog_limit 8192
Dec 03 20:34:25 localhost augenrules[726]: lost 0
Dec 03 20:34:25 localhost augenrules[726]: backlog 3
Dec 03 20:34:25 localhost augenrules[726]: backlog_wait_time 60000
Dec 03 20:34:25 localhost augenrules[726]: backlog_wait_time_actual 0
Dec 03 20:34:25 localhost augenrules[726]: enabled 1
Dec 03 20:34:25 localhost augenrules[726]: failure 1
Dec 03 20:34:25 localhost augenrules[726]: pid 706
Dec 03 20:34:25 localhost augenrules[726]: rate_limit 0
Dec 03 20:34:25 localhost augenrules[726]: backlog_limit 8192
Dec 03 20:34:25 localhost augenrules[726]: lost 0
Dec 03 20:34:25 localhost augenrules[726]: backlog 0
Dec 03 20:34:25 localhost augenrules[726]: backlog_wait_time 60000
Dec 03 20:34:25 localhost augenrules[726]: backlog_wait_time_actual 0
Dec 03 20:34:25 localhost augenrules[726]: enabled 1
Dec 03 20:34:25 localhost augenrules[726]: failure 1
Dec 03 20:34:25 localhost augenrules[726]: pid 706
Dec 03 20:34:25 localhost augenrules[726]: rate_limit 0
Dec 03 20:34:25 localhost augenrules[726]: backlog_limit 8192
Dec 03 20:34:25 localhost augenrules[726]: lost 0
Dec 03 20:34:25 localhost augenrules[726]: backlog 0
Dec 03 20:34:25 localhost augenrules[726]: backlog_wait_time 60000
Dec 03 20:34:25 localhost augenrules[726]: backlog_wait_time_actual 0
Dec 03 20:34:25 localhost systemd[1]: Started Security Auditing Service.
Dec 03 20:34:25 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 03 20:34:25 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 03 20:34:25 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 03 20:34:25 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 03 20:34:25 localhost systemd[1]: Starting Update is Completed...
Dec 03 20:34:25 localhost systemd[1]: Finished Update is Completed.
Dec 03 20:34:25 localhost systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Dec 03 20:34:25 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 03 20:34:25 localhost systemd[1]: Reached target System Initialization.
Dec 03 20:34:25 localhost systemd[1]: Started dnf makecache --timer.
Dec 03 20:34:25 localhost systemd[1]: Started Daily rotation of log files.
Dec 03 20:34:25 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 03 20:34:25 localhost systemd[1]: Reached target Timer Units.
Dec 03 20:34:25 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 03 20:34:25 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 03 20:34:25 localhost systemd[1]: Reached target Socket Units.
Dec 03 20:34:25 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 03 20:34:25 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 03 20:34:25 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 03 20:34:25 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 03 20:34:25 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 03 20:34:25 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 03 20:34:25 localhost systemd-udevd[748]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 20:34:25 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 03 20:34:25 localhost systemd[1]: Reached target Basic System.
Dec 03 20:34:25 localhost dbus-broker-lau[743]: Ready
Dec 03 20:34:25 localhost systemd[1]: Starting NTP client/server...
Dec 03 20:34:25 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 03 20:34:25 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 03 20:34:25 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 03 20:34:25 localhost systemd[1]: Started irqbalance daemon.
Dec 03 20:34:25 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 03 20:34:25 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 03 20:34:25 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 03 20:34:25 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 03 20:34:25 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 03 20:34:25 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 03 20:34:25 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 03 20:34:25 localhost systemd[1]: Starting User Login Management...
Dec 03 20:34:25 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 03 20:34:25 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 03 20:34:25 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 03 20:34:25 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 03 20:34:25 localhost chronyd[798]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 03 20:34:25 localhost systemd[1]: Started NTP client/server.
Dec 03 20:34:25 localhost chronyd[798]: Loaded 0 symmetric keys
Dec 03 20:34:25 localhost chronyd[798]: Using right/UTC timezone to obtain leap second data
Dec 03 20:34:25 localhost chronyd[798]: Loaded seccomp filter (level 2)
Dec 03 20:34:25 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 03 20:34:25 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 03 20:34:25 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 03 20:34:25 localhost systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 03 20:34:25 localhost systemd-logind[787]: New seat seat0.
Dec 03 20:34:25 localhost systemd[1]: Started User Login Management.
Dec 03 20:34:25 localhost systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 03 20:34:26 localhost kernel: kvm_amd: TSC scaling supported
Dec 03 20:34:26 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 03 20:34:26 localhost kernel: kvm_amd: Nested Paging enabled
Dec 03 20:34:26 localhost kernel: kvm_amd: LBR virtualization supported
Dec 03 20:34:26 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 03 20:34:26 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 03 20:34:26 localhost kernel: Console: switching to colour dummy device 80x25
Dec 03 20:34:26 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 03 20:34:26 localhost kernel: [drm] features: -context_init
Dec 03 20:34:26 localhost kernel: [drm] number of scanouts: 1
Dec 03 20:34:26 localhost kernel: [drm] number of cap sets: 0
Dec 03 20:34:26 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 03 20:34:26 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 03 20:34:26 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 03 20:34:26 localhost iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Dec 03 20:34:26 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 03 20:34:26 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 03 20:34:26 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 03 Dec 2025 20:34:26 +0000. Up 5.89 seconds.
Dec 03 20:34:26 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 03 20:34:26 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 03 20:34:26 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp7wgqn6ps.mount: Deactivated successfully.
Dec 03 20:34:26 localhost systemd[1]: Starting Hostname Service...
Dec 03 20:34:26 localhost systemd[1]: Started Hostname Service.
Dec 03 20:34:26 np0005544708.novalocal systemd-hostnamed[856]: Hostname set to <np0005544708.novalocal> (static)
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Reached target Preparation for Network.
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Starting Network Manager...
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8691] NetworkManager (version 1.54.1-1.el9) is starting... (boot:cb12512a-3aa8-4735-9c82-f409c246c155)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8697] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8778] manager[0x5556d1397080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8812] hostname: hostname: using hostnamed
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8812] hostname: static hostname changed from (none) to "np0005544708.novalocal"
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8817] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8913] manager[0x5556d1397080]: rfkill: Wi-Fi hardware radio set enabled
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8914] manager[0x5556d1397080]: rfkill: WWAN hardware radio set enabled
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8956] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8957] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8959] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8960] manager: Networking is enabled by state file
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8962] settings: Loaded settings plugin: keyfile (internal)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8976] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.8996] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9011] dhcp: init: Using DHCP client 'internal'
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9014] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9031] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9040] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9049] device (lo): Activation: starting connection 'lo' (2dc71c9a-e258-42ef-b117-38009802277f)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9059] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9063] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9089] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9095] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9099] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9102] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9104] device (eth0): carrier: link connected
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9109] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9116] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9123] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9128] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9129] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9132] manager: NetworkManager state is now CONNECTING
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9133] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9144] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9147] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Started Network Manager.
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Reached target Network.
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9191] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9200] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9221] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9389] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9392] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9393] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9400] device (lo): Activation: successful, device activated.
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9406] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9410] manager: NetworkManager state is now CONNECTED_SITE
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9414] device (eth0): Activation: successful, device activated.
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9419] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 03 20:34:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794066.9422] manager: startup complete
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Reached target NFS client services.
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Reached target Remote File Systems.
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 03 20:34:26 np0005544708.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 03 Dec 2025 20:34:27 +0000. Up 6.91 seconds.
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.219         | 255.255.255.0 | global | fa:16:3e:28:69:f5 |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe28:69f5/64 |       .       |  link  | fa:16:3e:28:69:f5 |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 03 20:34:27 np0005544708.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 03 20:34:28 np0005544708.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Dec 03 20:34:28 np0005544708.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 03 20:34:28 np0005544708.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Dec 03 20:34:28 np0005544708.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Dec 03 20:34:28 np0005544708.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Dec 03 20:34:28 np0005544708.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Generating public/private rsa key pair.
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: The key fingerprint is:
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: SHA256:YAOdlDquc6Fe8SNA1LKtvklICtWqcMSeI1+YY8n33tM root@np0005544708.novalocal
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: The key's randomart image is:
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: +---[RSA 3072]----+
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |  ...o.o         |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: | o....+          |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |  =+..+          |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: | *.*+. o         |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |+.#o+.  S        |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |=B.*o+           |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |+.ooo.+  .       |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: | .=o.o o. E      |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: | .++  . ..       |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: The key fingerprint is:
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: SHA256:0oqKrqywu4tzYmMxViC3pHm53h4s8MeGMaGkrx8VNiY root@np0005544708.novalocal
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: The key's randomart image is:
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: +---[ECDSA 256]---+
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |                 |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |..o              |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |.*Eo=            |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |=.+* o .         |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |+.+.. . S        |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: | B.B . o         |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |o.O.B .          |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |*Bo*..           |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |^@+..            |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: The key fingerprint is:
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: SHA256:G3SRB4rAJKwfwBspc1v9P/VfxSmeg7EUWPrf5xVp1sE root@np0005544708.novalocal
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: The key's randomart image is:
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: +--[ED25519 256]--+
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |..ooo.    =+     |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |++o.o... ooo. .  |
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |.=oo  ..o....  E.|
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |..o    ....o.. .*|
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: | . .    S..o=.o=o|
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |  .      ooo.+= o|
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |        .  . ..o+|
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |               .+|
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: |                .|
Dec 03 20:34:28 np0005544708.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Reached target Network is Online.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting System Logging Service...
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 03 20:34:28 np0005544708.novalocal sm-notify[1005]: Version 2.5.4 starting
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting Permit User Sessions...
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 03 20:34:28 np0005544708.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Dec 03 20:34:28 np0005544708.novalocal sshd[1007]: Server listening on :: port 22.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Finished Permit User Sessions.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Started Command Scheduler.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Started Getty on tty1.
Dec 03 20:34:28 np0005544708.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Dec 03 20:34:28 np0005544708.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 03 20:34:28 np0005544708.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 77% if used.)
Dec 03 20:34:28 np0005544708.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Reached target Login Prompts.
Dec 03 20:34:28 np0005544708.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Started System Logging Service.
Dec 03 20:34:28 np0005544708.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Reached target Multi-User System.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1028]: Unable to negotiate with 38.102.83.114 port 46786: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 03 20:34:28 np0005544708.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1034]: Connection reset by 38.102.83.114 port 46798 [preauth]
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1042]: Unable to negotiate with 38.102.83.114 port 46806: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1051]: Unable to negotiate with 38.102.83.114 port 46818: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1060]: Connection reset by 38.102.83.114 port 46824 [preauth]
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1015]: Connection closed by 38.102.83.114 port 46778 [preauth]
Dec 03 20:34:28 np0005544708.novalocal kdumpctl[1021]: kdump: No kdump initial ramdisk found.
Dec 03 20:34:28 np0005544708.novalocal kdumpctl[1021]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1096]: Unable to negotiate with 38.102.83.114 port 46842: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1112]: Unable to negotiate with 38.102.83.114 port 46856: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 03 20:34:28 np0005544708.novalocal cloud-init[1154]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 03 Dec 2025 20:34:28 +0000. Up 8.41 seconds.
Dec 03 20:34:28 np0005544708.novalocal sshd-session[1076]: Connection closed by 38.102.83.114 port 46838 [preauth]
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 03 20:34:28 np0005544708.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 03 20:34:29 np0005544708.novalocal dracut[1285]: dracut-057-102.git20250818.el9
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1303]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 03 Dec 2025 20:34:29 +0000. Up 8.79 seconds.
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1310]: #############################################################
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1315]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1324]: 256 SHA256:0oqKrqywu4tzYmMxViC3pHm53h4s8MeGMaGkrx8VNiY root@np0005544708.novalocal (ECDSA)
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1329]: 256 SHA256:G3SRB4rAJKwfwBspc1v9P/VfxSmeg7EUWPrf5xVp1sE root@np0005544708.novalocal (ED25519)
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1333]: 3072 SHA256:YAOdlDquc6Fe8SNA1LKtvklICtWqcMSeI1+YY8n33tM root@np0005544708.novalocal (RSA)
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1336]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1338]: #############################################################
Dec 03 20:34:29 np0005544708.novalocal cloud-init[1303]: Cloud-init v. 24.4-7.el9 finished at Wed, 03 Dec 2025 20:34:29 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 8.97 seconds
Dec 03 20:34:29 np0005544708.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 03 20:34:29 np0005544708.novalocal systemd[1]: Reached target Cloud-init target.
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 03 20:34:29 np0005544708.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: memstrack is not available
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: memstrack is not available
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 03 20:34:30 np0005544708.novalocal dracut[1287]: *** Including module: systemd ***
Dec 03 20:34:31 np0005544708.novalocal dracut[1287]: *** Including module: fips ***
Dec 03 20:34:31 np0005544708.novalocal dracut[1287]: *** Including module: systemd-initrd ***
Dec 03 20:34:31 np0005544708.novalocal dracut[1287]: *** Including module: i18n ***
Dec 03 20:34:31 np0005544708.novalocal dracut[1287]: *** Including module: drm ***
Dec 03 20:34:31 np0005544708.novalocal chronyd[798]: Selected source 174.142.148.226 (2.centos.pool.ntp.org)
Dec 03 20:34:31 np0005544708.novalocal chronyd[798]: System clock TAI offset set to 37 seconds
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]: *** Including module: prefixdevname ***
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]: *** Including module: kernel-modules ***
Dec 03 20:34:32 np0005544708.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]: *** Including module: kernel-modules-extra ***
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 03 20:34:32 np0005544708.novalocal dracut[1287]: *** Including module: qemu ***
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: *** Including module: fstab-sys ***
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: *** Including module: rootfs-block ***
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: *** Including module: terminfo ***
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: *** Including module: udev-rules ***
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: Skipping udev rule: 91-permissions.rules
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: *** Including module: virtiofs ***
Dec 03 20:34:33 np0005544708.novalocal dracut[1287]: *** Including module: dracut-systemd ***
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]: *** Including module: usrmount ***
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]: *** Including module: base ***
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]: *** Including module: fs-lib ***
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]: *** Including module: kdumpbase ***
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:   microcode_ctl module: mangling fw_dir
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel" is ignored
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 03 20:34:34 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]: *** Including module: openssl ***
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]: *** Including module: shutdown ***
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]: *** Including module: squash ***
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]: *** Including modules done ***
Dec 03 20:34:35 np0005544708.novalocal dracut[1287]: *** Installing kernel module dependencies ***
Dec 03 20:34:36 np0005544708.novalocal dracut[1287]: *** Installing kernel module dependencies done ***
Dec 03 20:34:36 np0005544708.novalocal dracut[1287]: *** Resolving executable dependencies ***
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 35 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 35 affinity is now unmanaged
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 25 affinity is now unmanaged
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 33 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 33 affinity is now unmanaged
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 31 affinity is now unmanaged
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 34 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 34 affinity is now unmanaged
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 32 affinity is now unmanaged
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 30 affinity is now unmanaged
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 03 20:34:36 np0005544708.novalocal irqbalance[782]: IRQ 29 affinity is now unmanaged
Dec 03 20:34:37 np0005544708.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 03 20:34:37 np0005544708.novalocal dracut[1287]: *** Resolving executable dependencies done ***
Dec 03 20:34:37 np0005544708.novalocal dracut[1287]: *** Generating early-microcode cpio image ***
Dec 03 20:34:37 np0005544708.novalocal dracut[1287]: *** Store current command line parameters ***
Dec 03 20:34:37 np0005544708.novalocal dracut[1287]: Stored kernel commandline:
Dec 03 20:34:37 np0005544708.novalocal dracut[1287]: No dracut internal kernel commandline stored in the initramfs
Dec 03 20:34:37 np0005544708.novalocal dracut[1287]: *** Install squash loader ***
Dec 03 20:34:38 np0005544708.novalocal dracut[1287]: *** Squashing the files inside the initramfs ***
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: *** Squashing the files inside the initramfs done ***
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: *** Hardlinking files ***
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: Mode:           real
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: Files:          50
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: Linked:         0 files
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: Compared:       0 xattrs
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: Compared:       0 files
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: Saved:          0 B
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: Duration:       0.001270 seconds
Dec 03 20:34:39 np0005544708.novalocal dracut[1287]: *** Hardlinking files done ***
Dec 03 20:34:40 np0005544708.novalocal dracut[1287]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 03 20:34:40 np0005544708.novalocal kdumpctl[1021]: kdump: kexec: loaded kdump kernel
Dec 03 20:34:40 np0005544708.novalocal kdumpctl[1021]: kdump: Starting kdump: [OK]
Dec 03 20:34:40 np0005544708.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 03 20:34:40 np0005544708.novalocal systemd[1]: Startup finished in 1.513s (kernel) + 2.426s (initrd) + 16.552s (userspace) = 20.491s.
Dec 03 20:34:43 np0005544708.novalocal sshd-session[4297]: Accepted publickey for zuul from 38.102.83.114 port 43612 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 03 20:34:43 np0005544708.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 03 20:34:43 np0005544708.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 03 20:34:43 np0005544708.novalocal systemd-logind[787]: New session 1 of user zuul.
Dec 03 20:34:43 np0005544708.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 03 20:34:43 np0005544708.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Queued start job for default target Main User Target.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Created slice User Application Slice.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Started Daily Cleanup of User's Temporary Directories.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Reached target Paths.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Reached target Timers.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Starting D-Bus User Message Bus Socket...
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Starting Create User's Volatile Files and Directories...
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Finished Create User's Volatile Files and Directories.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Listening on D-Bus User Message Bus Socket.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Reached target Sockets.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Reached target Basic System.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Reached target Main User Target.
Dec 03 20:34:43 np0005544708.novalocal systemd[4301]: Startup finished in 107ms.
Dec 03 20:34:43 np0005544708.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 03 20:34:43 np0005544708.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 03 20:34:43 np0005544708.novalocal sshd-session[4297]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:34:43 np0005544708.novalocal python3[4383]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:34:46 np0005544708.novalocal python3[4411]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:34:52 np0005544708.novalocal python3[4469]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:34:52 np0005544708.novalocal python3[4509]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 03 20:34:54 np0005544708.novalocal python3[4535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwU1XEfVDogsv6Y2JkZySMwq4Zdohkns3qBSg0XZ4yFEAOoqZTnyPrnCKWaH3Im/T599uliDyAHCYxlL6OopZ/VWx95YbCuoI/yWVcgeEyF+++N6GlQQnBVQcmvA7B0Mvv0wQfvmyE2+SOtTYySvBBUayoBE5AcQxi3hiXg2cegKwOdg/iepD9KMibLthbj40MXgn1e88YaS8jmBUIIAtx7rHFDvRugPF8YtbeW8k3nkZxqlZRFr7yqETQIEwKC3o3fbVYOMgV+7l0ep6A0TUKktH5h8YXnQqSzafMqseDnwb2Lu8WaszB/3k887xp9Lc3Q+Wl1p4xJCK5oVvG2oZceTSf7imAFaVubxK7bzUagWdMM3K8lAHy6GIBBuUKl9ePwrZTYITAP724k3XR1mm6Ind0GLmZFUvA3oC5B+G9Jay8Z9MO88r+xSDy5fusvo1ygMxKqUTuOt1amdLlDy0n95O5VxSzTVtGvIk2s3tzF0J7aUtHCqcak63BRTevb5c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:34:55 np0005544708.novalocal python3[4559]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:34:55 np0005544708.novalocal python3[4658]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:34:55 np0005544708.novalocal python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764794095.2719893-207-38996794071089/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=212c8454592e4879b062383806272265_id_rsa follow=False checksum=744ad42b3431e855d34445ed08d0cada55a7c21f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:34:56 np0005544708.novalocal python3[4852]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:34:56 np0005544708.novalocal python3[4923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764794096.1869526-240-235335856868184/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=212c8454592e4879b062383806272265_id_rsa.pub follow=False checksum=afffad914350138a10afbc72b08c3c0848ab6f39 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:34:56 np0005544708.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 03 20:34:58 np0005544708.novalocal python3[4973]: ansible-ping Invoked with data=pong
Dec 03 20:34:58 np0005544708.novalocal python3[4997]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:35:00 np0005544708.novalocal python3[5055]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 03 20:35:01 np0005544708.novalocal python3[5087]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:01 np0005544708.novalocal python3[5111]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:01 np0005544708.novalocal python3[5135]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:02 np0005544708.novalocal python3[5159]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:02 np0005544708.novalocal python3[5183]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:02 np0005544708.novalocal python3[5207]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:04 np0005544708.novalocal sudo[5231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpspekoakmcqgqvswlnjuobuldhwlshl ; /usr/bin/python3'
Dec 03 20:35:04 np0005544708.novalocal sudo[5231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:04 np0005544708.novalocal python3[5233]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:04 np0005544708.novalocal sudo[5231]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:04 np0005544708.novalocal sudo[5309]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbvgjjzsgcptedephwkzjxdznnnxceff ; /usr/bin/python3'
Dec 03 20:35:04 np0005544708.novalocal sudo[5309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:04 np0005544708.novalocal python3[5311]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:35:04 np0005544708.novalocal sudo[5309]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:05 np0005544708.novalocal sudo[5382]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pevlixjpmhtyfhgzvmjytqclhozumeii ; /usr/bin/python3'
Dec 03 20:35:05 np0005544708.novalocal sudo[5382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:05 np0005544708.novalocal python3[5384]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764794104.3639655-21-41117920850078/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:05 np0005544708.novalocal sudo[5382]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:05 np0005544708.novalocal python3[5432]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:06 np0005544708.novalocal python3[5456]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:06 np0005544708.novalocal python3[5480]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:06 np0005544708.novalocal python3[5504]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:06 np0005544708.novalocal python3[5528]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:07 np0005544708.novalocal python3[5552]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:07 np0005544708.novalocal python3[5576]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:07 np0005544708.novalocal python3[5600]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:07 np0005544708.novalocal python3[5624]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:08 np0005544708.novalocal python3[5648]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:08 np0005544708.novalocal python3[5672]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:08 np0005544708.novalocal python3[5696]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:09 np0005544708.novalocal python3[5720]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:09 np0005544708.novalocal python3[5744]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:09 np0005544708.novalocal python3[5768]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:09 np0005544708.novalocal python3[5792]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:10 np0005544708.novalocal python3[5816]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:10 np0005544708.novalocal python3[5840]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:10 np0005544708.novalocal python3[5864]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:10 np0005544708.novalocal python3[5888]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:11 np0005544708.novalocal python3[5912]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:11 np0005544708.novalocal python3[5936]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:11 np0005544708.novalocal python3[5960]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:12 np0005544708.novalocal python3[5984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:12 np0005544708.novalocal python3[6008]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:12 np0005544708.novalocal python3[6032]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:35:15 np0005544708.novalocal sudo[6056]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdyxmufnvsarhsjyempdcgewehnvmqw ; /usr/bin/python3'
Dec 03 20:35:15 np0005544708.novalocal sudo[6056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:15 np0005544708.novalocal python3[6058]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 03 20:35:15 np0005544708.novalocal systemd[1]: Starting Time & Date Service...
Dec 03 20:35:15 np0005544708.novalocal systemd[1]: Started Time & Date Service.
Dec 03 20:35:15 np0005544708.novalocal systemd-timedated[6060]: Changed time zone to 'UTC' (UTC).
Dec 03 20:35:15 np0005544708.novalocal sudo[6056]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:15 np0005544708.novalocal sudo[6087]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzcfhaubchuxqjemfnotdxhwzeifdmcz ; /usr/bin/python3'
Dec 03 20:35:16 np0005544708.novalocal sudo[6087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:16 np0005544708.novalocal python3[6089]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:16 np0005544708.novalocal sudo[6087]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:16 np0005544708.novalocal python3[6165]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:35:16 np0005544708.novalocal python3[6236]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764794116.31435-153-170510912668961/source _original_basename=tmpta0iq8a5 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:17 np0005544708.novalocal python3[6336]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:35:17 np0005544708.novalocal python3[6407]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764794117.1395614-183-275094820537795/source _original_basename=tmpv1kyb819 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:18 np0005544708.novalocal sudo[6507]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkejtjidatthuigfjioyrgkzcppdlehc ; /usr/bin/python3'
Dec 03 20:35:18 np0005544708.novalocal sudo[6507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:18 np0005544708.novalocal python3[6509]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:35:18 np0005544708.novalocal sudo[6507]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:18 np0005544708.novalocal sudo[6580]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bihsdmwtlchfxbxouxdjuxtfbnijanim ; /usr/bin/python3'
Dec 03 20:35:18 np0005544708.novalocal sudo[6580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:18 np0005544708.novalocal python3[6582]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764794118.2228136-231-11114428022070/source _original_basename=tmpnej6c4z7 follow=False checksum=873438299bb17ff1128a56bbeb324b7beaf57647 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:18 np0005544708.novalocal sudo[6580]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:19 np0005544708.novalocal python3[6630]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:35:19 np0005544708.novalocal python3[6656]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:35:20 np0005544708.novalocal sudo[6734]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knwnjkipeekhtxqnhdhbtkbtnrbimwgw ; /usr/bin/python3'
Dec 03 20:35:20 np0005544708.novalocal sudo[6734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:20 np0005544708.novalocal python3[6736]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:35:20 np0005544708.novalocal sudo[6734]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:20 np0005544708.novalocal sudo[6807]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bozsosauekovzjwocwtdsnlxhvwgrdud ; /usr/bin/python3'
Dec 03 20:35:20 np0005544708.novalocal sudo[6807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:20 np0005544708.novalocal python3[6809]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764794119.8686967-273-38026006013058/source _original_basename=tmp38hiw7_r follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:20 np0005544708.novalocal sudo[6807]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:20 np0005544708.novalocal sudo[6858]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rysegoylyyetnsavcikqodvesxsxiizq ; /usr/bin/python3'
Dec 03 20:35:20 np0005544708.novalocal sudo[6858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:21 np0005544708.novalocal python3[6860]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-63dc-f7bd-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:35:21 np0005544708.novalocal sudo[6858]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:21 np0005544708.novalocal python3[6888]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-63dc-f7bd-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 03 20:35:22 np0005544708.novalocal python3[6916]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:39 np0005544708.novalocal sudo[6940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhmlmusvkplrylnohkhqqpohfigxaase ; /usr/bin/python3'
Dec 03 20:35:39 np0005544708.novalocal sudo[6940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:35:39 np0005544708.novalocal python3[6942]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:35:39 np0005544708.novalocal sudo[6940]: pam_unix(sudo:session): session closed for user root
Dec 03 20:35:45 np0005544708.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 03 20:36:14 np0005544708.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 03 20:36:14 np0005544708.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7450] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 03 20:36:14 np0005544708.novalocal systemd-udevd[6945]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7598] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7618] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7621] device (eth1): carrier: link connected
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7623] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7628] policy: auto-activating connection 'Wired connection 1' (3fbb02b6-2141-3242-b016-afde93023b71)
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7631] device (eth1): Activation: starting connection 'Wired connection 1' (3fbb02b6-2141-3242-b016-afde93023b71)
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7632] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7634] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7637] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 20:36:14 np0005544708.novalocal NetworkManager[860]: <info>  [1764794174.7641] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 03 20:36:15 np0005544708.novalocal python3[6972]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-ef37-d628-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:36:25 np0005544708.novalocal sudo[7050]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbgemvqdzegbsywvexanlwiiaiogznwn ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 03 20:36:25 np0005544708.novalocal sudo[7050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:36:25 np0005544708.novalocal python3[7052]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:36:25 np0005544708.novalocal sudo[7050]: pam_unix(sudo:session): session closed for user root
Dec 03 20:36:25 np0005544708.novalocal sudo[7123]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfknmiihgonuiakpfymrpbtqzmqtomtk ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 03 20:36:25 np0005544708.novalocal sudo[7123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:36:25 np0005544708.novalocal python3[7125]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764794185.3297126-102-241354330614831/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=d896b078404086040eb34304c2daff8442162aca backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:36:25 np0005544708.novalocal sudo[7123]: pam_unix(sudo:session): session closed for user root
Dec 03 20:36:26 np0005544708.novalocal sudo[7173]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yszjpwjmvundiswrdehdnlsyivqugyhz ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 03 20:36:26 np0005544708.novalocal sudo[7173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:36:26 np0005544708.novalocal python3[7175]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Stopping Network Manager...
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8048] caught SIGTERM, shutting down normally.
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8054] dhcp4 (eth0): canceled DHCP transaction
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8054] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8054] dhcp4 (eth0): state changed no lease
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8056] manager: NetworkManager state is now CONNECTING
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8156] dhcp4 (eth1): canceled DHCP transaction
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8156] dhcp4 (eth1): state changed no lease
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[860]: <info>  [1764794186.8215] exiting (success)
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Stopped Network Manager.
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Starting Network Manager...
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.8654] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:cb12512a-3aa8-4735-9c82-f409c246c155)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.8657] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.8703] manager[0x560886f5b070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Starting Hostname Service...
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Started Hostname Service.
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9492] hostname: hostname: using hostnamed
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9495] hostname: static hostname changed from (none) to "np0005544708.novalocal"
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9499] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9504] manager[0x560886f5b070]: rfkill: Wi-Fi hardware radio set enabled
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9505] manager[0x560886f5b070]: rfkill: WWAN hardware radio set enabled
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9534] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9535] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9536] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9536] manager: Networking is enabled by state file
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9538] settings: Loaded settings plugin: keyfile (internal)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9542] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9572] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9582] dhcp: init: Using DHCP client 'internal'
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9585] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9591] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9597] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9605] device (lo): Activation: starting connection 'lo' (2dc71c9a-e258-42ef-b117-38009802277f)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9611] device (eth0): carrier: link connected
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9616] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9621] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9621] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9629] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9636] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9642] device (eth1): carrier: link connected
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9646] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9651] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (3fbb02b6-2141-3242-b016-afde93023b71) (indicated)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9652] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9658] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9665] device (eth1): Activation: starting connection 'Wired connection 1' (3fbb02b6-2141-3242-b016-afde93023b71)
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Started Network Manager.
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9671] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9676] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9679] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9681] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9683] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9686] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9689] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9691] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9695] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9702] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9705] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9713] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9715] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9729] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9733] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9739] device (lo): Activation: successful, device activated.
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9747] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9754] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 03 20:36:26 np0005544708.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9819] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9839] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9841] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9844] manager: NetworkManager state is now CONNECTED_SITE
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9849] device (eth0): Activation: successful, device activated.
Dec 03 20:36:26 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794186.9854] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 03 20:36:26 np0005544708.novalocal sudo[7173]: pam_unix(sudo:session): session closed for user root
Dec 03 20:36:27 np0005544708.novalocal python3[7259]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-ef37-d628-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:36:37 np0005544708.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 03 20:36:56 np0005544708.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.3689] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 03 20:37:12 np0005544708.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 03 20:37:12 np0005544708.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4046] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4048] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4054] device (eth1): Activation: successful, device activated.
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4063] manager: startup complete
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4067] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <warn>  [1764794232.4073] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4086] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4186] dhcp4 (eth1): canceled DHCP transaction
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4186] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4186] dhcp4 (eth1): state changed no lease
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4197] policy: auto-activating connection 'ci-private-network' (82caecc5-1713-50fd-827f-a8910de7f4a3)
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4201] device (eth1): Activation: starting connection 'ci-private-network' (82caecc5-1713-50fd-827f-a8910de7f4a3)
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4202] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4204] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4209] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4215] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4606] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4608] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 20:37:12 np0005544708.novalocal NetworkManager[7187]: <info>  [1764794232.4614] device (eth1): Activation: successful, device activated.
Dec 03 20:37:22 np0005544708.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 03 20:37:24 np0005544708.novalocal sudo[7362]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpmifudzolzjerodbrjzhrehjfxlagxw ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 03 20:37:24 np0005544708.novalocal sudo[7362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:37:25 np0005544708.novalocal python3[7364]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:37:25 np0005544708.novalocal sudo[7362]: pam_unix(sudo:session): session closed for user root
Dec 03 20:37:25 np0005544708.novalocal sudo[7435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbcnoiuckcchgdijcaroqrzuzvytzcbs ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 03 20:37:25 np0005544708.novalocal sudo[7435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:37:25 np0005544708.novalocal python3[7437]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764794244.8579648-267-269166847295127/source _original_basename=tmpttdtq7zu follow=False checksum=04d30233cf826d59f5b8db4451ae2768a3645fb5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:37:25 np0005544708.novalocal sudo[7435]: pam_unix(sudo:session): session closed for user root
Dec 03 20:37:38 np0005544708.novalocal systemd[4301]: Starting Mark boot as successful...
Dec 03 20:37:38 np0005544708.novalocal systemd[4301]: Finished Mark boot as successful.
Dec 03 20:38:25 np0005544708.novalocal sshd-session[4310]: Received disconnect from 38.102.83.114 port 43612:11: disconnected by user
Dec 03 20:38:25 np0005544708.novalocal sshd-session[4310]: Disconnected from user zuul 38.102.83.114 port 43612
Dec 03 20:38:25 np0005544708.novalocal sshd-session[4297]: pam_unix(sshd:session): session closed for user zuul
Dec 03 20:38:25 np0005544708.novalocal systemd-logind[787]: Session 1 logged out. Waiting for processes to exit.
Dec 03 20:40:38 np0005544708.novalocal systemd[4301]: Created slice User Background Tasks Slice.
Dec 03 20:40:38 np0005544708.novalocal systemd[4301]: Starting Cleanup of User's Temporary Files and Directories...
Dec 03 20:40:38 np0005544708.novalocal systemd[4301]: Finished Cleanup of User's Temporary Files and Directories.
Dec 03 20:42:18 np0005544708.novalocal sshd-session[7468]: Accepted publickey for zuul from 38.102.83.114 port 39302 ssh2: RSA SHA256:Cbt6DRjvlxgKyw9DjjqWJJ3+P4VAN6Cwz5dn2cu8Cgg
Dec 03 20:42:18 np0005544708.novalocal systemd-logind[787]: New session 3 of user zuul.
Dec 03 20:42:18 np0005544708.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 03 20:42:18 np0005544708.novalocal sshd-session[7468]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:42:18 np0005544708.novalocal sudo[7495]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cznulzspwppajijijhghhncdwjyqeshu ; /usr/bin/python3'
Dec 03 20:42:18 np0005544708.novalocal sudo[7495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:18 np0005544708.novalocal python3[7497]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-24ce-0cab-000000001cc0-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:42:18 np0005544708.novalocal sudo[7495]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:18 np0005544708.novalocal sudo[7523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qytsecfjjnsqdxucqajawfojavqaocpl ; /usr/bin/python3'
Dec 03 20:42:18 np0005544708.novalocal sudo[7523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:18 np0005544708.novalocal python3[7525]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:42:18 np0005544708.novalocal sudo[7523]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:19 np0005544708.novalocal sudo[7549]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izneyjgzoptyibokpcbknnmfiewykczz ; /usr/bin/python3'
Dec 03 20:42:19 np0005544708.novalocal sudo[7549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:19 np0005544708.novalocal python3[7551]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:42:19 np0005544708.novalocal sudo[7549]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:19 np0005544708.novalocal sudo[7576]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfxakvpuyvjtxacfkqirehbwrztgnddp ; /usr/bin/python3'
Dec 03 20:42:19 np0005544708.novalocal sudo[7576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:19 np0005544708.novalocal python3[7578]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:42:19 np0005544708.novalocal sudo[7576]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:19 np0005544708.novalocal sudo[7602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmqoczdlcherbbljlllypluzdcjcocvj ; /usr/bin/python3'
Dec 03 20:42:19 np0005544708.novalocal sudo[7602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:19 np0005544708.novalocal python3[7604]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:42:19 np0005544708.novalocal sudo[7602]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:20 np0005544708.novalocal sudo[7628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwhuxdgzyeycsgunqyoncgaucdadqhmt ; /usr/bin/python3'
Dec 03 20:42:20 np0005544708.novalocal sudo[7628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:20 np0005544708.novalocal python3[7630]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:42:20 np0005544708.novalocal sudo[7628]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:20 np0005544708.novalocal sudo[7706]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdozxdnfiqtyqtourmjhnybnascjzvuk ; /usr/bin/python3'
Dec 03 20:42:20 np0005544708.novalocal sudo[7706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:20 np0005544708.novalocal python3[7708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:42:20 np0005544708.novalocal sudo[7706]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:21 np0005544708.novalocal sudo[7779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-argerkduvhlfalxwywtyyzasvdnrfzgt ; /usr/bin/python3'
Dec 03 20:42:21 np0005544708.novalocal sudo[7779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:21 np0005544708.novalocal python3[7781]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764794540.581451-466-97533957781398/source _original_basename=tmpau87und0 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:42:21 np0005544708.novalocal sudo[7779]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:21 np0005544708.novalocal sudo[7829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczyccnlqllpccggsmjhtnveujycchzm ; /usr/bin/python3'
Dec 03 20:42:21 np0005544708.novalocal sudo[7829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:22 np0005544708.novalocal python3[7831]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 20:42:22 np0005544708.novalocal systemd[1]: Reloading.
Dec 03 20:42:22 np0005544708.novalocal systemd-rc-local-generator[7852]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 20:42:22 np0005544708.novalocal sudo[7829]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:23 np0005544708.novalocal sudo[7885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyeeurmpjzqdqcnahbiwzhtzpihwwdez ; /usr/bin/python3'
Dec 03 20:42:23 np0005544708.novalocal sudo[7885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:23 np0005544708.novalocal python3[7887]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 03 20:42:23 np0005544708.novalocal sudo[7885]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:23 np0005544708.novalocal sudo[7911]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qujdatwcihgeirckpxcdufnbwjiifbsc ; /usr/bin/python3'
Dec 03 20:42:23 np0005544708.novalocal sudo[7911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:24 np0005544708.novalocal python3[7913]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:42:24 np0005544708.novalocal sudo[7911]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:24 np0005544708.novalocal sudo[7939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvdnqwvdqglyerewmzrmwublesvrtcxh ; /usr/bin/python3'
Dec 03 20:42:24 np0005544708.novalocal sudo[7939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:24 np0005544708.novalocal python3[7941]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:42:24 np0005544708.novalocal sudo[7939]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:24 np0005544708.novalocal sudo[7967]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siwcxorfdvrxyjwosbpvhlmyscdhifmg ; /usr/bin/python3'
Dec 03 20:42:24 np0005544708.novalocal sudo[7967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:24 np0005544708.novalocal python3[7969]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:42:24 np0005544708.novalocal sudo[7967]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:24 np0005544708.novalocal sudo[7995]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndaxsssznoinsvnkcqokhahiwdqykviq ; /usr/bin/python3'
Dec 03 20:42:24 np0005544708.novalocal sudo[7995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:24 np0005544708.novalocal python3[7997]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:42:24 np0005544708.novalocal sudo[7995]: pam_unix(sudo:session): session closed for user root
Dec 03 20:42:25 np0005544708.novalocal python3[8024]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ec2-ffbe-24ce-0cab-000000001cc7-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:42:25 np0005544708.novalocal python3[8054]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 20:42:27 np0005544708.novalocal sshd-session[7471]: Connection closed by 38.102.83.114 port 39302
Dec 03 20:42:27 np0005544708.novalocal sshd-session[7468]: pam_unix(sshd:session): session closed for user zuul
Dec 03 20:42:27 np0005544708.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 03 20:42:27 np0005544708.novalocal systemd[1]: session-3.scope: Consumed 4.226s CPU time.
Dec 03 20:42:27 np0005544708.novalocal systemd-logind[787]: Session 3 logged out. Waiting for processes to exit.
Dec 03 20:42:27 np0005544708.novalocal systemd-logind[787]: Removed session 3.
Dec 03 20:42:29 np0005544708.novalocal sshd-session[8059]: Accepted publickey for zuul from 38.102.83.114 port 58900 ssh2: RSA SHA256:Cbt6DRjvlxgKyw9DjjqWJJ3+P4VAN6Cwz5dn2cu8Cgg
Dec 03 20:42:29 np0005544708.novalocal systemd-logind[787]: New session 4 of user zuul.
Dec 03 20:42:29 np0005544708.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 03 20:42:29 np0005544708.novalocal sshd-session[8059]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:42:29 np0005544708.novalocal sudo[8086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwjxjzzrsxkmqqlxtctrrbtctigftktf ; /usr/bin/python3'
Dec 03 20:42:29 np0005544708.novalocal sudo[8086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:42:29 np0005544708.novalocal python3[8088]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 20:42:44 np0005544708.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 20:42:53 np0005544708.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 20:43:02 np0005544708.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 20:43:03 np0005544708.novalocal setsebool[8154]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 03 20:43:03 np0005544708.novalocal setsebool[8154]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  Converting 388 SID table entries...
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 20:43:14 np0005544708.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 20:43:32 np0005544708.novalocal dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 03 20:43:32 np0005544708.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 20:43:32 np0005544708.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 03 20:43:32 np0005544708.novalocal systemd[1]: Reloading.
Dec 03 20:43:32 np0005544708.novalocal systemd-rc-local-generator[8911]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 20:43:32 np0005544708.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 20:43:34 np0005544708.novalocal sudo[8086]: pam_unix(sudo:session): session closed for user root
Dec 03 20:43:39 np0005544708.novalocal python3[13894]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ec2-ffbe-b733-f88c-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:43:40 np0005544708.novalocal kernel: evm: overlay not supported
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: Starting D-Bus User Message Bus...
Dec 03 20:43:40 np0005544708.novalocal dbus-broker-launch[14112]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 03 20:43:40 np0005544708.novalocal dbus-broker-launch[14112]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: Started D-Bus User Message Bus.
Dec 03 20:43:40 np0005544708.novalocal dbus-broker-lau[14112]: Ready
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: Created slice Slice /user.
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: podman-14045.scope: unit configures an IP firewall, but not running as root.
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: (This warning is only shown for the first unit using IP firewalling.)
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: Started podman-14045.scope.
Dec 03 20:43:40 np0005544708.novalocal systemd[4301]: Started podman-pause-17e5c5a3.scope.
Dec 03 20:43:41 np0005544708.novalocal sudo[14617]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eratbwsakcvwwtznshaxclelzqpsgjdh ; /usr/bin/python3'
Dec 03 20:43:41 np0005544708.novalocal sudo[14617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:43:41 np0005544708.novalocal python3[14631]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.217:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.217:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:43:41 np0005544708.novalocal python3[14631]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 03 20:43:41 np0005544708.novalocal sudo[14617]: pam_unix(sudo:session): session closed for user root
Dec 03 20:43:41 np0005544708.novalocal sshd-session[8062]: Connection closed by 38.102.83.114 port 58900
Dec 03 20:43:41 np0005544708.novalocal sshd-session[8059]: pam_unix(sshd:session): session closed for user zuul
Dec 03 20:43:41 np0005544708.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 03 20:43:41 np0005544708.novalocal systemd[1]: session-4.scope: Consumed 59.611s CPU time.
Dec 03 20:43:41 np0005544708.novalocal systemd-logind[787]: Session 4 logged out. Waiting for processes to exit.
Dec 03 20:43:41 np0005544708.novalocal systemd-logind[787]: Removed session 4.
Dec 03 20:44:20 np0005544708.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 20:44:20 np0005544708.novalocal systemd[1]: Finished man-db-cache-update.service.
Dec 03 20:44:20 np0005544708.novalocal systemd[1]: man-db-cache-update.service: Consumed 58.221s CPU time.
Dec 03 20:44:20 np0005544708.novalocal systemd[1]: run-r6e0b67deac2841308050a4b0221d5f07.service: Deactivated successfully.
Dec 03 20:44:21 np0005544708.novalocal sshd-session[29567]: Connection closed by 38.102.83.47 port 38396 [preauth]
Dec 03 20:44:21 np0005544708.novalocal sshd-session[29564]: Unable to negotiate with 38.102.83.47 port 38412: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 03 20:44:21 np0005544708.novalocal sshd-session[29563]: Connection closed by 38.102.83.47 port 38380 [preauth]
Dec 03 20:44:21 np0005544708.novalocal sshd-session[29565]: Unable to negotiate with 38.102.83.47 port 38424: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 03 20:44:21 np0005544708.novalocal sshd-session[29566]: Unable to negotiate with 38.102.83.47 port 38434: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 03 20:44:24 np0005544708.novalocal sshd-session[29573]: Accepted publickey for zuul from 38.102.83.114 port 41398 ssh2: RSA SHA256:Cbt6DRjvlxgKyw9DjjqWJJ3+P4VAN6Cwz5dn2cu8Cgg
Dec 03 20:44:24 np0005544708.novalocal systemd-logind[787]: New session 5 of user zuul.
Dec 03 20:44:24 np0005544708.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 03 20:44:24 np0005544708.novalocal sshd-session[29573]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:44:25 np0005544708.novalocal python3[29600]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM0hVga1zRkUzrYCe7oc50WLLPKVDkVFkArpF5CarLZ9i3k6P99COH1nadZDO3eIJdFZ/LXbq11sH+72H0chG0g= zuul@np0005544707.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:44:25 np0005544708.novalocal sudo[29624]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amhebvuzbwpmvnxfghscriqtevnfmgzv ; /usr/bin/python3'
Dec 03 20:44:25 np0005544708.novalocal sudo[29624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:44:25 np0005544708.novalocal python3[29626]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM0hVga1zRkUzrYCe7oc50WLLPKVDkVFkArpF5CarLZ9i3k6P99COH1nadZDO3eIJdFZ/LXbq11sH+72H0chG0g= zuul@np0005544707.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:44:25 np0005544708.novalocal sudo[29624]: pam_unix(sudo:session): session closed for user root
Dec 03 20:44:26 np0005544708.novalocal sudo[29650]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjgmbcjbxqbmvbnatjfkgaudgdxpmgyu ; /usr/bin/python3'
Dec 03 20:44:26 np0005544708.novalocal sudo[29650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:44:26 np0005544708.novalocal python3[29652]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005544708.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 03 20:44:26 np0005544708.novalocal useradd[29654]: new group: name=cloud-admin, GID=1002
Dec 03 20:44:26 np0005544708.novalocal useradd[29654]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 03 20:44:26 np0005544708.novalocal sudo[29650]: pam_unix(sudo:session): session closed for user root
Dec 03 20:44:26 np0005544708.novalocal sudo[29684]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owsqwldvzlpgmjksuqpiwkikwzyyenmb ; /usr/bin/python3'
Dec 03 20:44:26 np0005544708.novalocal sudo[29684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:44:26 np0005544708.novalocal python3[29686]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM0hVga1zRkUzrYCe7oc50WLLPKVDkVFkArpF5CarLZ9i3k6P99COH1nadZDO3eIJdFZ/LXbq11sH+72H0chG0g= zuul@np0005544707.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 03 20:44:26 np0005544708.novalocal sudo[29684]: pam_unix(sudo:session): session closed for user root
Dec 03 20:44:27 np0005544708.novalocal sudo[29762]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqivxvakawpppyilmartaalgyphzjykd ; /usr/bin/python3'
Dec 03 20:44:27 np0005544708.novalocal sudo[29762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:44:27 np0005544708.novalocal python3[29764]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:44:27 np0005544708.novalocal sudo[29762]: pam_unix(sudo:session): session closed for user root
Dec 03 20:44:27 np0005544708.novalocal sudo[29835]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-butqpdejhykdwxxnzlyuikykesnylyyu ; /usr/bin/python3'
Dec 03 20:44:27 np0005544708.novalocal sudo[29835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:44:27 np0005544708.novalocal python3[29837]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764794666.9580677-137-37094700288810/source _original_basename=tmpg7otz_t1 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:44:27 np0005544708.novalocal sudo[29835]: pam_unix(sudo:session): session closed for user root
Dec 03 20:44:28 np0005544708.novalocal sudo[29885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmvsmotecqpanylbmcwipyhqvxdlthfl ; /usr/bin/python3'
Dec 03 20:44:28 np0005544708.novalocal sudo[29885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:44:28 np0005544708.novalocal python3[29887]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 03 20:44:28 np0005544708.novalocal systemd[1]: Starting Hostname Service...
Dec 03 20:44:28 np0005544708.novalocal systemd[1]: Started Hostname Service.
Dec 03 20:44:28 np0005544708.novalocal systemd-hostnamed[29891]: Changed pretty hostname to 'compute-0'
Dec 03 20:44:28 compute-0 systemd-hostnamed[29891]: Hostname set to <compute-0> (static)
Dec 03 20:44:28 compute-0 NetworkManager[7187]: <info>  [1764794668.8887] hostname: static hostname changed from "np0005544708.novalocal" to "compute-0"
Dec 03 20:44:28 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 03 20:44:28 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 03 20:44:28 compute-0 sudo[29885]: pam_unix(sudo:session): session closed for user root
Dec 03 20:44:29 compute-0 sshd-session[29576]: Connection closed by 38.102.83.114 port 41398
Dec 03 20:44:29 compute-0 sshd-session[29573]: pam_unix(sshd:session): session closed for user zuul
Dec 03 20:44:29 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Dec 03 20:44:29 compute-0 systemd[1]: session-5.scope: Consumed 2.627s CPU time.
Dec 03 20:44:29 compute-0 systemd-logind[787]: Session 5 logged out. Waiting for processes to exit.
Dec 03 20:44:29 compute-0 systemd-logind[787]: Removed session 5.
Dec 03 20:44:38 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 03 20:44:58 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 03 20:45:53 compute-0 sshd-session[29910]: Received disconnect from 80.94.93.119 port 21122:11:  [preauth]
Dec 03 20:45:53 compute-0 sshd-session[29910]: Disconnected from authenticating user root 80.94.93.119 port 21122 [preauth]
Dec 03 20:48:13 compute-0 sshd-session[29914]: Accepted publickey for zuul from 38.102.83.47 port 44370 ssh2: RSA SHA256:Cbt6DRjvlxgKyw9DjjqWJJ3+P4VAN6Cwz5dn2cu8Cgg
Dec 03 20:48:13 compute-0 systemd-logind[787]: New session 6 of user zuul.
Dec 03 20:48:13 compute-0 systemd[1]: Started Session 6 of User zuul.
Dec 03 20:48:13 compute-0 sshd-session[29914]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:48:14 compute-0 python3[29990]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:48:16 compute-0 sudo[30104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsceczjspzjilmbgcnxipwxyehzgmiev ; /usr/bin/python3'
Dec 03 20:48:16 compute-0 sudo[30104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:16 compute-0 python3[30106]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:48:16 compute-0 sudo[30104]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:16 compute-0 sudo[30177]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fokqfhcfixkunhsomwyfdgqrcgtpfhad ; /usr/bin/python3'
Dec 03 20:48:16 compute-0 sudo[30177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:17 compute-0 python3[30179]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764794896.1794226-33662-64347837682086/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:48:17 compute-0 sudo[30177]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:17 compute-0 sudo[30203]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbzebgzynedhtaspxachaozhszsgyfye ; /usr/bin/python3'
Dec 03 20:48:17 compute-0 sudo[30203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:17 compute-0 python3[30205]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:48:17 compute-0 sudo[30203]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:17 compute-0 sudo[30276]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnjxqzdeznlkteiapjjhxwxunhazvflg ; /usr/bin/python3'
Dec 03 20:48:17 compute-0 sudo[30276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:17 compute-0 python3[30278]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764794896.1794226-33662-64347837682086/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:48:17 compute-0 sudo[30276]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:17 compute-0 sudo[30302]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrygizcuzztmxikmryhgntmerychyxba ; /usr/bin/python3'
Dec 03 20:48:17 compute-0 sudo[30302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:17 compute-0 python3[30304]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:48:17 compute-0 sudo[30302]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:18 compute-0 sudo[30375]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgnxlqylsmdvxyqzpniryhnogkjhimwe ; /usr/bin/python3'
Dec 03 20:48:18 compute-0 sudo[30375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:18 compute-0 python3[30377]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764794896.1794226-33662-64347837682086/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:48:18 compute-0 sudo[30375]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:18 compute-0 sudo[30401]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofcfjogszkicyjvhncqhbfzqxuiihocy ; /usr/bin/python3'
Dec 03 20:48:18 compute-0 sudo[30401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:18 compute-0 python3[30403]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:48:18 compute-0 sudo[30401]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:18 compute-0 sudo[30474]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atvzdjujjvusnwdvzcqmntdwgrsehves ; /usr/bin/python3'
Dec 03 20:48:18 compute-0 sudo[30474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:18 compute-0 python3[30476]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764794896.1794226-33662-64347837682086/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:48:18 compute-0 sudo[30474]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:18 compute-0 sudo[30500]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prvqumwsqcwvqeyamzfakqjgsqmosbhc ; /usr/bin/python3'
Dec 03 20:48:18 compute-0 sudo[30500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:19 compute-0 python3[30502]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:48:19 compute-0 sudo[30500]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:19 compute-0 sudo[30573]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixxgqecewzjdafcuopkuhspkbannnuus ; /usr/bin/python3'
Dec 03 20:48:19 compute-0 sudo[30573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:19 compute-0 python3[30575]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764794896.1794226-33662-64347837682086/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:48:19 compute-0 sudo[30573]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:19 compute-0 sudo[30599]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xunvwsrpmvvaehzdnyxftqbdpfhqfcuo ; /usr/bin/python3'
Dec 03 20:48:19 compute-0 sudo[30599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:19 compute-0 python3[30601]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:48:19 compute-0 sudo[30599]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:19 compute-0 sudo[30672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjmwzmbjnktxbykjuxklxakpqhszduqq ; /usr/bin/python3'
Dec 03 20:48:19 compute-0 sudo[30672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:20 compute-0 python3[30674]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764794896.1794226-33662-64347837682086/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:48:20 compute-0 sudo[30672]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:20 compute-0 sudo[30698]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bavxlurfrayxpidaanrlaafmlzqywufx ; /usr/bin/python3'
Dec 03 20:48:20 compute-0 sudo[30698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:20 compute-0 python3[30700]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 20:48:20 compute-0 sudo[30698]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:20 compute-0 sudo[30771]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avtpvtlkmlefcfhdiuhheqdjejnezjlw ; /usr/bin/python3'
Dec 03 20:48:20 compute-0 sudo[30771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:48:20 compute-0 python3[30773]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764794896.1794226-33662-64347837682086/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:48:20 compute-0 sudo[30771]: pam_unix(sudo:session): session closed for user root
Dec 03 20:48:23 compute-0 sshd-session[30798]: Connection closed by 192.168.122.11 port 40890 [preauth]
Dec 03 20:48:23 compute-0 sshd-session[30799]: Connection closed by 192.168.122.11 port 40898 [preauth]
Dec 03 20:48:23 compute-0 sshd-session[30800]: Unable to negotiate with 192.168.122.11 port 40914: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 03 20:48:23 compute-0 sshd-session[30801]: Unable to negotiate with 192.168.122.11 port 40918: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 03 20:48:23 compute-0 sshd-session[30802]: Unable to negotiate with 192.168.122.11 port 40924: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 03 20:48:32 compute-0 python3[30831]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:49:38 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 03 20:49:38 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 03 20:49:38 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 03 20:49:38 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 03 20:53:31 compute-0 sshd-session[29917]: Received disconnect from 38.102.83.47 port 44370:11: disconnected by user
Dec 03 20:53:31 compute-0 sshd-session[29917]: Disconnected from user zuul 38.102.83.47 port 44370
Dec 03 20:53:31 compute-0 sshd-session[29914]: pam_unix(sshd:session): session closed for user zuul
Dec 03 20:53:31 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Dec 03 20:53:31 compute-0 systemd[1]: session-6.scope: Consumed 5.490s CPU time.
Dec 03 20:53:31 compute-0 systemd-logind[787]: Session 6 logged out. Waiting for processes to exit.
Dec 03 20:53:31 compute-0 systemd-logind[787]: Removed session 6.
Dec 03 20:57:38 compute-0 systemd[1]: Starting dnf makecache...
Dec 03 20:57:38 compute-0 dnf[30838]: Failed determining last makecache time.
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-barbican-42b4c41831408a8e323 287 kB/s |  13 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 1.9 MB/s |  65 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.2 MB/s |  32 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-python-stevedore-c4acc5639fd2329372142 5.3 MB/s | 131 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.5 MB/s |  32 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-os-net-config-d0cedbdb788d43e5c7551df5  13 MB/s | 349 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 2.0 MB/s |  42 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-python-designate-tests-tempest-347fdbc 884 kB/s |  18 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-glance-1fd12c29b339f30fe823e 916 kB/s |  18 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.5 MB/s |  29 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-manila-3c01b7181572c95dac462 1.3 MB/s |  25 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-python-whitebox-neutron-tests-tempest- 6.0 MB/s | 154 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-octavia-ba397f07a7331190208c 1.3 MB/s |  26 kB     00:00
Dec 03 20:57:38 compute-0 dnf[30838]: delorean-openstack-watcher-c014f81a8647287f6dcc 791 kB/s |  16 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: delorean-ansible-config_template-5ccaa22121a7ff 350 kB/s | 7.4 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 6.5 MB/s | 144 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: delorean-openstack-swift-dc98a8463506ac520c469a 667 kB/s |  14 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: delorean-python-tempestconf-8515371b7cceebd4282 2.7 MB/s |  53 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.4 MB/s |  96 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: CentOS Stream 9 - BaseOS                         62 kB/s | 6.4 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: CentOS Stream 9 - AppStream                      28 kB/s | 6.5 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: CentOS Stream 9 - CRB                            74 kB/s | 6.3 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: CentOS Stream 9 - Extras packages                77 kB/s | 8.3 kB     00:00
Dec 03 20:57:39 compute-0 dnf[30838]: dlrn-antelope-testing                            31 MB/s | 1.1 MB     00:00
Dec 03 20:57:40 compute-0 dnf[30838]: dlrn-antelope-build-deps                         17 MB/s | 461 kB     00:00
Dec 03 20:57:40 compute-0 dnf[30838]: centos9-rabbitmq                                7.5 MB/s | 123 kB     00:00
Dec 03 20:57:40 compute-0 dnf[30838]: centos9-storage                                  24 MB/s | 415 kB     00:00
Dec 03 20:57:40 compute-0 dnf[30838]: centos9-opstools                                4.3 MB/s |  51 kB     00:00
Dec 03 20:57:40 compute-0 dnf[30838]: NFV SIG OpenvSwitch                              20 MB/s | 456 kB     00:00
Dec 03 20:57:41 compute-0 dnf[30838]: repo-setup-centos-appstream                      87 MB/s |  25 MB     00:00
Dec 03 20:57:47 compute-0 dnf[30838]: repo-setup-centos-baseos                         77 MB/s | 8.8 MB     00:00
Dec 03 20:57:48 compute-0 dnf[30838]: repo-setup-centos-highavailability               33 MB/s | 744 kB     00:00
Dec 03 20:57:48 compute-0 dnf[30838]: repo-setup-centos-powertools                     80 MB/s | 7.3 MB     00:00
Dec 03 20:57:51 compute-0 dnf[30838]: Extra Packages for Enterprise Linux 9 - x86_64   17 MB/s |  20 MB     00:01
Dec 03 20:58:04 compute-0 dnf[30838]: Metadata cache created.
Dec 03 20:58:04 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 03 20:58:04 compute-0 systemd[1]: Finished dnf makecache.
Dec 03 20:58:04 compute-0 systemd[1]: dnf-makecache.service: Consumed 24.404s CPU time.
Dec 03 20:59:06 compute-0 sshd-session[30941]: Accepted publickey for zuul from 192.168.122.30 port 38828 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 20:59:06 compute-0 systemd-logind[787]: New session 7 of user zuul.
Dec 03 20:59:06 compute-0 systemd[1]: Started Session 7 of User zuul.
Dec 03 20:59:06 compute-0 sshd-session[30941]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:59:07 compute-0 python3.9[31094]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:59:08 compute-0 sudo[31273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwiekbcwpkkaslkkzhgivtboashqoovs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795548.2265768-32-179223785271119/AnsiballZ_command.py'
Dec 03 20:59:08 compute-0 sudo[31273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:08 compute-0 python3.9[31275]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:59:16 compute-0 sudo[31273]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:16 compute-0 sshd-session[30944]: Connection closed by 192.168.122.30 port 38828
Dec 03 20:59:16 compute-0 sshd-session[30941]: pam_unix(sshd:session): session closed for user zuul
Dec 03 20:59:16 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Dec 03 20:59:16 compute-0 systemd[1]: session-7.scope: Consumed 8.480s CPU time.
Dec 03 20:59:16 compute-0 systemd-logind[787]: Session 7 logged out. Waiting for processes to exit.
Dec 03 20:59:16 compute-0 systemd-logind[787]: Removed session 7.
Dec 03 20:59:32 compute-0 sshd-session[31333]: Accepted publickey for zuul from 192.168.122.30 port 60590 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 20:59:32 compute-0 systemd-logind[787]: New session 8 of user zuul.
Dec 03 20:59:32 compute-0 systemd[1]: Started Session 8 of User zuul.
Dec 03 20:59:32 compute-0 sshd-session[31333]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 20:59:33 compute-0 python3.9[31486]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 03 20:59:34 compute-0 python3.9[31660]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:59:35 compute-0 sudo[31810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfgrkuffkcmjyrfhqfybulrtywqhvolt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795574.6580744-45-163984415791464/AnsiballZ_command.py'
Dec 03 20:59:35 compute-0 sudo[31810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:35 compute-0 python3.9[31812]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 20:59:35 compute-0 sudo[31810]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:36 compute-0 sudo[31963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqvaupqljtqehesqctwgtfbcdgrqqgcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795575.6649773-57-102834537482419/AnsiballZ_stat.py'
Dec 03 20:59:36 compute-0 sudo[31963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:36 compute-0 python3.9[31965]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 20:59:36 compute-0 sudo[31963]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:36 compute-0 sudo[32115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foqxoakhkejhnwheqrggkwktjogqrxpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795576.5334158-65-160859373870634/AnsiballZ_file.py'
Dec 03 20:59:36 compute-0 sudo[32115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:37 compute-0 python3.9[32117]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:59:37 compute-0 sudo[32115]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:37 compute-0 sudo[32267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylvwwxembxzoxpiampzyauedadnbvdza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795577.3647454-73-231380558920240/AnsiballZ_stat.py'
Dec 03 20:59:37 compute-0 sudo[32267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:38 compute-0 python3.9[32269]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 20:59:38 compute-0 sudo[32267]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:38 compute-0 sudo[32390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddvcdlcptvcvvoalsgoksyexjmbbeohc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795577.3647454-73-231380558920240/AnsiballZ_copy.py'
Dec 03 20:59:38 compute-0 sudo[32390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:38 compute-0 python3.9[32392]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795577.3647454-73-231380558920240/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:59:38 compute-0 sudo[32390]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:39 compute-0 sudo[32542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twebudffmswtzcwpkctfeeibjkqrrcsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795578.9599972-88-173144617530134/AnsiballZ_setup.py'
Dec 03 20:59:39 compute-0 sudo[32542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:39 compute-0 python3.9[32544]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:59:39 compute-0 sudo[32542]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:40 compute-0 sudo[32698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekretfywplprbxoeolyxkdmbvxvatokx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795579.955584-96-90045481350026/AnsiballZ_file.py'
Dec 03 20:59:40 compute-0 sudo[32698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:40 compute-0 python3.9[32700]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 20:59:40 compute-0 sudo[32698]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:41 compute-0 sudo[32850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvhkuresxkxnjzupfzcswnctruectvzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795580.7219098-105-107131756413951/AnsiballZ_file.py'
Dec 03 20:59:41 compute-0 sudo[32850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:41 compute-0 python3.9[32852]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 20:59:41 compute-0 sudo[32850]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:42 compute-0 python3.9[33002]: ansible-ansible.builtin.service_facts Invoked
Dec 03 20:59:47 compute-0 python3.9[33255]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 20:59:48 compute-0 python3.9[33405]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:59:49 compute-0 python3.9[33559]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 20:59:50 compute-0 sudo[33715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccomvltndtvjqagwuuwxgnckljyeeheq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795589.7550848-153-140077152701753/AnsiballZ_setup.py'
Dec 03 20:59:50 compute-0 sudo[33715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:50 compute-0 python3.9[33717]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 20:59:50 compute-0 sudo[33715]: pam_unix(sudo:session): session closed for user root
Dec 03 20:59:51 compute-0 sudo[33799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpavlwmttnziletqfbawhihznirgdaam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795589.7550848-153-140077152701753/AnsiballZ_dnf.py'
Dec 03 20:59:51 compute-0 sudo[33799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 20:59:51 compute-0 python3.9[33801]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:00:34 compute-0 systemd[1]: Reloading.
Dec 03 21:00:34 compute-0 systemd-rc-local-generator[34000]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:00:34 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 03 21:00:35 compute-0 systemd[1]: Reloading.
Dec 03 21:00:35 compute-0 systemd-rc-local-generator[34032]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:00:35 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 03 21:00:35 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 03 21:00:35 compute-0 systemd[1]: Reloading.
Dec 03 21:00:35 compute-0 systemd-rc-local-generator[34077]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:00:35 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 03 21:00:36 compute-0 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Dec 03 21:00:36 compute-0 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Dec 03 21:00:36 compute-0 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Dec 03 21:00:50 compute-0 sshd-session[34138]: Received disconnect from 80.94.93.233 port 58708:11:  [preauth]
Dec 03 21:00:50 compute-0 sshd-session[34138]: Disconnected from authenticating user root 80.94.93.233 port 58708 [preauth]
Dec 03 21:01:01 compute-0 CROND[34181]: (root) CMD (run-parts /etc/cron.hourly)
Dec 03 21:01:01 compute-0 run-parts[34184]: (/etc/cron.hourly) starting 0anacron
Dec 03 21:01:01 compute-0 anacron[34192]: Anacron started on 2025-12-03
Dec 03 21:01:01 compute-0 anacron[34192]: Will run job `cron.daily' in 17 min.
Dec 03 21:01:01 compute-0 anacron[34192]: Will run job `cron.weekly' in 37 min.
Dec 03 21:01:01 compute-0 anacron[34192]: Will run job `cron.monthly' in 57 min.
Dec 03 21:01:01 compute-0 anacron[34192]: Jobs will be executed sequentially
Dec 03 21:01:01 compute-0 run-parts[34194]: (/etc/cron.hourly) finished 0anacron
Dec 03 21:01:01 compute-0 CROND[34180]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 03 21:01:41 compute-0 kernel: SELinux:  Converting 2719 SID table entries...
Dec 03 21:01:41 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 21:01:41 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 03 21:01:41 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 21:01:41 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 03 21:01:41 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 21:01:41 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 21:01:41 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 21:01:41 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 03 21:01:42 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:01:42 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:01:42 compute-0 systemd[1]: Reloading.
Dec 03 21:01:42 compute-0 systemd-rc-local-generator[34417]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:01:42 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 21:01:42 compute-0 sudo[33799]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:43 compute-0 sudo[35330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfgdpqhkwohkfzwwfhzwqlpgiasipxin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795702.987242-165-265228414441128/AnsiballZ_command.py'
Dec 03 21:01:43 compute-0 sudo[35330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:43 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:01:43 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:01:43 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.530s CPU time.
Dec 03 21:01:43 compute-0 systemd[1]: run-rf47e1ca3264549e2adb42401c31b8e1b.service: Deactivated successfully.
Dec 03 21:01:43 compute-0 python3.9[35339]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:01:44 compute-0 sudo[35330]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:45 compute-0 sudo[35620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcdxxzqxxjzzyxdiwivxyehhwarffwtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795704.6912308-173-155186222020957/AnsiballZ_selinux.py'
Dec 03 21:01:45 compute-0 sudo[35620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:45 compute-0 python3.9[35622]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 03 21:01:45 compute-0 sudo[35620]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:46 compute-0 sudo[35772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkxcuovecnhiiwifqowgsyxwdqaoigcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795706.1268048-184-225247385575427/AnsiballZ_command.py'
Dec 03 21:01:46 compute-0 sudo[35772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:46 compute-0 python3.9[35774]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 03 21:01:47 compute-0 sudo[35772]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:48 compute-0 sudo[35925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dohtugejribasflnwacfwqeyxrvitfgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795707.912209-192-61146465212359/AnsiballZ_file.py'
Dec 03 21:01:48 compute-0 sudo[35925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:49 compute-0 python3.9[35927]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:01:49 compute-0 sudo[35925]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:50 compute-0 sudo[36077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foxtoqrnupjakjcpmcfgvfxvdwoqdess ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795709.5228567-200-266707164551191/AnsiballZ_mount.py'
Dec 03 21:01:50 compute-0 sudo[36077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:50 compute-0 python3.9[36079]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 03 21:01:50 compute-0 sudo[36077]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:51 compute-0 sudo[36229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwcfuhkyivxcvhqxdwoqcuygnzmpzjaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795711.1460688-228-188478177119733/AnsiballZ_file.py'
Dec 03 21:01:51 compute-0 sudo[36229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:51 compute-0 python3.9[36231]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:01:51 compute-0 sudo[36229]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:52 compute-0 sudo[36381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpkifygnfxqiisazldcmznsgwzvxekqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795712.016875-236-108060662745938/AnsiballZ_stat.py'
Dec 03 21:01:52 compute-0 sudo[36381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:52 compute-0 python3.9[36383]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:01:52 compute-0 sudo[36381]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:53 compute-0 sudo[36504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baettryoocfrqtreqvrftksypzsfsefv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795712.016875-236-108060662745938/AnsiballZ_copy.py'
Dec 03 21:01:53 compute-0 sudo[36504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:53 compute-0 python3.9[36506]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795712.016875-236-108060662745938/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b6113f6c3c3112d11c0348cd0a11619cc2e5f10c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:01:53 compute-0 sudo[36504]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:54 compute-0 sudo[36656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yackieariorikawittgfauwjdndmnpov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795713.8346658-260-238959622500880/AnsiballZ_stat.py'
Dec 03 21:01:54 compute-0 sudo[36656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:54 compute-0 python3.9[36658]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:01:54 compute-0 sudo[36656]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:54 compute-0 sudo[36808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udlbpqvlmaooyjgovqisxigoubcnqaxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795714.6236298-268-19458754104683/AnsiballZ_command.py'
Dec 03 21:01:54 compute-0 sudo[36808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:55 compute-0 python3.9[36810]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:01:55 compute-0 sudo[36808]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:55 compute-0 sudo[36961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elnanclfpjecxdcaexpcdffqmhamdjlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795715.4407792-276-247117759958489/AnsiballZ_file.py'
Dec 03 21:01:55 compute-0 sudo[36961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:01:56 compute-0 python3.9[36963]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:01:56 compute-0 sudo[36961]: pam_unix(sudo:session): session closed for user root
Dec 03 21:01:56 compute-0 sudo[37113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyypxrtbrwrzckgpryfnxewhybdvssln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795716.3998616-287-156889724611038/AnsiballZ_getent.py'
Dec 03 21:01:56 compute-0 sudo[37113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:01 compute-0 python3.9[37115]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 03 21:02:01 compute-0 sudo[37113]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:01 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:02:02 compute-0 sudo[37267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqxaniyibpiduieslevlkjikwswgzodr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795722.0720184-295-120668849031664/AnsiballZ_group.py'
Dec 03 21:02:02 compute-0 sudo[37267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:02 compute-0 python3.9[37269]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 03 21:02:02 compute-0 groupadd[37270]: group added to /etc/group: name=qemu, GID=107
Dec 03 21:02:02 compute-0 groupadd[37270]: group added to /etc/gshadow: name=qemu
Dec 03 21:02:02 compute-0 groupadd[37270]: new group: name=qemu, GID=107
Dec 03 21:02:02 compute-0 sudo[37267]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:03 compute-0 sudo[37425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rljihfyhwytwybbjejcrqhtbijecnmqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795723.0356746-303-66420758106868/AnsiballZ_user.py'
Dec 03 21:02:03 compute-0 sudo[37425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:03 compute-0 python3.9[37427]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 03 21:02:03 compute-0 useradd[37429]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 03 21:02:03 compute-0 sudo[37425]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:04 compute-0 sudo[37585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnijmkgpidudhdtjwnxzjtaogfqrvndj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795724.0346544-311-134366711797052/AnsiballZ_getent.py'
Dec 03 21:02:04 compute-0 sudo[37585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:04 compute-0 python3.9[37587]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 03 21:02:04 compute-0 sudo[37585]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:05 compute-0 sudo[37738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgxodtdoelcxxotmfduddbbfwljtjtzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795724.902117-319-86768076224850/AnsiballZ_group.py'
Dec 03 21:02:05 compute-0 sudo[37738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:05 compute-0 python3.9[37740]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 03 21:02:05 compute-0 groupadd[37741]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 03 21:02:05 compute-0 groupadd[37741]: group added to /etc/gshadow: name=hugetlbfs
Dec 03 21:02:05 compute-0 groupadd[37741]: new group: name=hugetlbfs, GID=42477
Dec 03 21:02:05 compute-0 sudo[37738]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:06 compute-0 sudo[37896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbfoncuzcvrpkfegymaagsovtifhroxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795725.8311148-328-128092803394762/AnsiballZ_file.py'
Dec 03 21:02:06 compute-0 sudo[37896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:06 compute-0 python3.9[37898]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 03 21:02:06 compute-0 sudo[37896]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:07 compute-0 sudo[38048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvtalfwzhokkrfauguyltigniutgrola ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795726.7525752-339-201592041810645/AnsiballZ_dnf.py'
Dec 03 21:02:07 compute-0 sudo[38048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:07 compute-0 python3.9[38050]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:02:08 compute-0 sudo[38048]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:09 compute-0 sudo[38201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daawxnlztcnijsfknjhxjxfdyeribjkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795729.0565522-347-171562525208741/AnsiballZ_file.py'
Dec 03 21:02:09 compute-0 sudo[38201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:09 compute-0 python3.9[38203]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:02:09 compute-0 sudo[38201]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:10 compute-0 sudo[38353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhgyjuniqnyberbcrpyjnvdrncjkmovm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795729.7922132-355-193119444306434/AnsiballZ_stat.py'
Dec 03 21:02:10 compute-0 sudo[38353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:10 compute-0 python3.9[38355]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:02:10 compute-0 sudo[38353]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:10 compute-0 sudo[38476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnfraeciwwwbrpaalirzxwkxwfgtzeos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795729.7922132-355-193119444306434/AnsiballZ_copy.py'
Dec 03 21:02:10 compute-0 sudo[38476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:11 compute-0 python3.9[38478]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764795729.7922132-355-193119444306434/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:02:11 compute-0 sudo[38476]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:11 compute-0 sudo[38628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaavmxhzfhzwovucsvhkvlrlxheecbek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795731.2734544-370-1665450894310/AnsiballZ_systemd.py'
Dec 03 21:02:11 compute-0 sudo[38628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:12 compute-0 python3.9[38630]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:02:12 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 03 21:02:12 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 03 21:02:12 compute-0 kernel: Bridge firewalling registered
Dec 03 21:02:12 compute-0 systemd-modules-load[38634]: Inserted module 'br_netfilter'
Dec 03 21:02:12 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 03 21:02:12 compute-0 sudo[38628]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:12 compute-0 sudo[38787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsgtrhxnxtofgupvycrhbxxjxhrsbffd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795732.521198-378-50133161386334/AnsiballZ_stat.py'
Dec 03 21:02:12 compute-0 sudo[38787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:13 compute-0 python3.9[38789]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:02:13 compute-0 sudo[38787]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:13 compute-0 sudo[38910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqrlpryruxyzqemnhzucgbtezktyyzmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795732.521198-378-50133161386334/AnsiballZ_copy.py'
Dec 03 21:02:13 compute-0 sudo[38910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:13 compute-0 python3.9[38912]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764795732.521198-378-50133161386334/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:02:13 compute-0 sudo[38910]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:14 compute-0 sudo[39062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biashkmhdoaljndxscfzuvbixinbrzsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795734.240972-396-194842982187473/AnsiballZ_dnf.py'
Dec 03 21:02:14 compute-0 sudo[39062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:14 compute-0 python3.9[39064]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:02:18 compute-0 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Dec 03 21:02:18 compute-0 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Dec 03 21:02:18 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:02:18 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:02:18 compute-0 systemd[1]: Reloading.
Dec 03 21:02:18 compute-0 systemd-rc-local-generator[39123]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:02:18 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 21:02:19 compute-0 sudo[39062]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:20 compute-0 python3.9[40382]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:02:20 compute-0 python3.9[41282]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 03 21:02:21 compute-0 python3.9[42028]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:02:22 compute-0 sudo[42893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pftpwmwttsxamcbxnybeivugomgegbvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795741.950426-435-93496311935512/AnsiballZ_command.py'
Dec 03 21:02:22 compute-0 sudo[42893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:22 compute-0 python3.9[42912]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:02:22 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 03 21:02:22 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:02:22 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:02:22 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.367s CPU time.
Dec 03 21:02:22 compute-0 systemd[1]: run-rb11c4417196d4ddea1d5e4d1102a5dae.service: Deactivated successfully.
Dec 03 21:02:22 compute-0 systemd[1]: Starting Authorization Manager...
Dec 03 21:02:22 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 03 21:02:23 compute-0 polkitd[43470]: Started polkitd version 0.117
Dec 03 21:02:23 compute-0 polkitd[43470]: Loading rules from directory /etc/polkit-1/rules.d
Dec 03 21:02:23 compute-0 polkitd[43470]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 03 21:02:23 compute-0 polkitd[43470]: Finished loading, compiling and executing 2 rules
Dec 03 21:02:23 compute-0 systemd[1]: Started Authorization Manager.
Dec 03 21:02:23 compute-0 polkitd[43470]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 03 21:02:23 compute-0 sudo[42893]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:23 compute-0 sudo[43638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qshgcxwvpzgfrtactepskvuxbrpwzegt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795743.3634868-444-246040629579566/AnsiballZ_systemd.py'
Dec 03 21:02:23 compute-0 sudo[43638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:24 compute-0 python3.9[43640]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:02:24 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 03 21:02:24 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 03 21:02:24 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 03 21:02:24 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 03 21:02:24 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 03 21:02:24 compute-0 sudo[43638]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:25 compute-0 python3.9[43802]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 03 21:02:27 compute-0 sudo[43952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teqoqhjwxryitcgwhmsvmdmkglhznrib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795746.713558-501-246717928241248/AnsiballZ_systemd.py'
Dec 03 21:02:27 compute-0 sudo[43952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:27 compute-0 python3.9[43954]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:02:27 compute-0 systemd[1]: Reloading.
Dec 03 21:02:27 compute-0 systemd-rc-local-generator[43982]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:02:27 compute-0 sudo[43952]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:28 compute-0 sudo[44142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruttjghslgsktpmdeacfhqgpkzwsqnqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795747.7850125-501-42904216422731/AnsiballZ_systemd.py'
Dec 03 21:02:28 compute-0 sudo[44142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:28 compute-0 python3.9[44144]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:02:28 compute-0 systemd[1]: Reloading.
Dec 03 21:02:28 compute-0 systemd-rc-local-generator[44174]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:02:28 compute-0 sudo[44142]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:29 compute-0 sudo[44331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyzfsnymlygbrfmtgfscbzbcpedzdnkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795749.001709-517-244619824608248/AnsiballZ_command.py'
Dec 03 21:02:29 compute-0 sudo[44331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:29 compute-0 python3.9[44333]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:02:29 compute-0 sudo[44331]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:30 compute-0 sudo[44484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdnthfnanirezquepvquiyewqtbhptsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795749.8367934-525-2413960886269/AnsiballZ_command.py'
Dec 03 21:02:30 compute-0 sudo[44484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:30 compute-0 python3.9[44486]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:02:30 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 03 21:02:30 compute-0 sudo[44484]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:31 compute-0 sudo[44639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjteutubpghhqlclhdtoijoysoqyaggq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795750.6634753-533-88157839709584/AnsiballZ_command.py'
Dec 03 21:02:31 compute-0 sudo[44639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:31 compute-0 python3.9[44641]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:02:32 compute-0 sudo[44639]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:33 compute-0 sudo[44801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbasnskiualcnkjcylzvxtuoapexwjjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795752.9742742-541-139847917113722/AnsiballZ_command.py'
Dec 03 21:02:33 compute-0 sudo[44801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:33 compute-0 python3.9[44803]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:02:33 compute-0 sudo[44801]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:34 compute-0 sudo[44954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgblkevnzqiiiaogiizvghktolksjkbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795753.6603794-549-26482312346476/AnsiballZ_systemd.py'
Dec 03 21:02:34 compute-0 sudo[44954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:34 compute-0 python3.9[44956]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:02:34 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 03 21:02:34 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Dec 03 21:02:34 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Dec 03 21:02:34 compute-0 systemd[1]: Starting Apply Kernel Variables...
Dec 03 21:02:34 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 03 21:02:34 compute-0 systemd[1]: Finished Apply Kernel Variables.
Dec 03 21:02:34 compute-0 sudo[44954]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:34 compute-0 sshd-session[31336]: Connection closed by 192.168.122.30 port 60590
Dec 03 21:02:34 compute-0 sshd-session[31333]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:02:34 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Dec 03 21:02:34 compute-0 systemd[1]: session-8.scope: Consumed 2min 20.167s CPU time.
Dec 03 21:02:34 compute-0 systemd-logind[787]: Session 8 logged out. Waiting for processes to exit.
Dec 03 21:02:34 compute-0 systemd-logind[787]: Removed session 8.
Dec 03 21:02:35 compute-0 sshd-session[44487]: Invalid user admin from 45.140.17.124 port 48538
Dec 03 21:02:35 compute-0 sshd-session[44487]: Connection reset by invalid user admin 45.140.17.124 port 48538 [preauth]
Dec 03 21:02:37 compute-0 sshd-session[44986]: Invalid user admin from 45.140.17.124 port 48550
Dec 03 21:02:38 compute-0 sshd-session[44986]: Connection reset by invalid user admin 45.140.17.124 port 48550 [preauth]
Dec 03 21:02:40 compute-0 sshd-session[44990]: Accepted publickey for zuul from 192.168.122.30 port 42026 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:02:40 compute-0 systemd-logind[787]: New session 9 of user zuul.
Dec 03 21:02:40 compute-0 systemd[1]: Started Session 9 of User zuul.
Dec 03 21:02:40 compute-0 sshd-session[44990]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:02:40 compute-0 sshd-session[44988]: Connection reset by authenticating user root 45.140.17.124 port 48562 [preauth]
Dec 03 21:02:41 compute-0 python3.9[45145]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:02:42 compute-0 sudo[45299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvdjkiyyqletgauljmlksklbubeckrap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795762.0320404-36-256562309487158/AnsiballZ_getent.py'
Dec 03 21:02:42 compute-0 sudo[45299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:42 compute-0 python3.9[45301]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 03 21:02:42 compute-0 sudo[45299]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:42 compute-0 sshd-session[45046]: Invalid user test from 45.140.17.124 port 28472
Dec 03 21:02:43 compute-0 sshd-session[45046]: Connection reset by invalid user test 45.140.17.124 port 28472 [preauth]
Dec 03 21:02:43 compute-0 sudo[45452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehcupmuxodlqimyfyheqvjsomihrmkwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795762.8564746-44-87561721572698/AnsiballZ_group.py'
Dec 03 21:02:43 compute-0 sudo[45452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:43 compute-0 python3.9[45454]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 03 21:02:43 compute-0 groupadd[45456]: group added to /etc/group: name=openvswitch, GID=42476
Dec 03 21:02:43 compute-0 groupadd[45456]: group added to /etc/gshadow: name=openvswitch
Dec 03 21:02:43 compute-0 groupadd[45456]: new group: name=openvswitch, GID=42476
Dec 03 21:02:43 compute-0 sudo[45452]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:44 compute-0 sudo[45612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwsjbvwmwjszaxiuklitjaigpghqjalg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795763.8509448-52-125498798287931/AnsiballZ_user.py'
Dec 03 21:02:44 compute-0 sudo[45612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:44 compute-0 python3.9[45614]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 03 21:02:44 compute-0 useradd[45616]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 03 21:02:44 compute-0 useradd[45616]: add 'openvswitch' to group 'hugetlbfs'
Dec 03 21:02:44 compute-0 useradd[45616]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 03 21:02:44 compute-0 sudo[45612]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:45 compute-0 sudo[45772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxsybiitrqhruixvyoyklbxiqhvnvqgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795765.063315-62-200898404579206/AnsiballZ_setup.py'
Dec 03 21:02:45 compute-0 sudo[45772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:45 compute-0 python3.9[45774]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:02:45 compute-0 sshd-session[45455]: Connection reset by authenticating user root 45.140.17.124 port 28486 [preauth]
Dec 03 21:02:45 compute-0 sudo[45772]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:46 compute-0 sudo[45856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxbotocrxhvgyiidhkebqddqgmbdupgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795765.063315-62-200898404579206/AnsiballZ_dnf.py'
Dec 03 21:02:46 compute-0 sudo[45856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:46 compute-0 python3.9[45858]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 03 21:02:48 compute-0 sudo[45856]: pam_unix(sudo:session): session closed for user root
Dec 03 21:02:49 compute-0 sudo[46019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhbwnjfiosmaytnixkupxqdcoqwwkwhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795769.0370185-76-10998163627497/AnsiballZ_dnf.py'
Dec 03 21:02:49 compute-0 sudo[46019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:02:49 compute-0 python3.9[46021]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:03:00 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Dec 03 21:03:00 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 21:03:00 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 03 21:03:00 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 21:03:00 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 03 21:03:00 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 21:03:00 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 21:03:00 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 21:03:00 compute-0 groupadd[46044]: group added to /etc/group: name=unbound, GID=993
Dec 03 21:03:00 compute-0 groupadd[46044]: group added to /etc/gshadow: name=unbound
Dec 03 21:03:00 compute-0 groupadd[46044]: new group: name=unbound, GID=993
Dec 03 21:03:00 compute-0 useradd[46051]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 03 21:03:01 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 03 21:03:01 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 03 21:03:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:03:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:03:02 compute-0 systemd[1]: Reloading.
Dec 03 21:03:02 compute-0 systemd-rc-local-generator[46550]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:03:02 compute-0 systemd-sysv-generator[46553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:03:03 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 21:03:03 compute-0 sudo[46019]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:03:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:03:03 compute-0 systemd[1]: run-r2904b5ae85c94a51bf39cda9d8f3f11b.service: Deactivated successfully.
Dec 03 21:03:04 compute-0 sudo[47118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmxvhscdwdjamdcputpiiaeqivahxeqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795783.896554-84-248959654025714/AnsiballZ_systemd.py'
Dec 03 21:03:04 compute-0 sudo[47118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:04 compute-0 python3.9[47120]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:03:04 compute-0 systemd[1]: Reloading.
Dec 03 21:03:05 compute-0 systemd-rc-local-generator[47149]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:03:05 compute-0 systemd-sysv-generator[47154]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:03:05 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Dec 03 21:03:05 compute-0 chown[47162]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 03 21:03:05 compute-0 ovs-ctl[47167]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 03 21:03:05 compute-0 ovs-ctl[47167]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 03 21:03:05 compute-0 ovs-ctl[47167]: Starting ovsdb-server [  OK  ]
Dec 03 21:03:05 compute-0 ovs-vsctl[47217]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 03 21:03:05 compute-0 ovs-vsctl[47233]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"f27c01e7-5b62-4209-a664-3ae50b74644d\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 03 21:03:05 compute-0 ovs-ctl[47167]: Configuring Open vSwitch system IDs [  OK  ]
Dec 03 21:03:05 compute-0 ovs-vsctl[47242]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 03 21:03:05 compute-0 ovs-ctl[47167]: Enabling remote OVSDB managers [  OK  ]
Dec 03 21:03:05 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Dec 03 21:03:05 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 03 21:03:05 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 03 21:03:05 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 03 21:03:05 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Dec 03 21:03:05 compute-0 ovs-ctl[47288]: Inserting openvswitch module [  OK  ]
Dec 03 21:03:05 compute-0 ovs-ctl[47256]: Starting ovs-vswitchd [  OK  ]
Dec 03 21:03:05 compute-0 ovs-vsctl[47305]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 03 21:03:05 compute-0 ovs-ctl[47256]: Enabling remote OVSDB managers [  OK  ]
Dec 03 21:03:05 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 03 21:03:05 compute-0 systemd[1]: Starting Open vSwitch...
Dec 03 21:03:05 compute-0 systemd[1]: Finished Open vSwitch.
Dec 03 21:03:06 compute-0 sudo[47118]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:06 compute-0 python3.9[47457]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:03:07 compute-0 sudo[47607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvkbdtjlwxulyqohndnynstaimyvrvlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795787.0840585-102-9544508857984/AnsiballZ_sefcontext.py'
Dec 03 21:03:07 compute-0 sudo[47607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:07 compute-0 python3.9[47609]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 03 21:03:08 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Dec 03 21:03:09 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 21:03:09 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 03 21:03:09 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 21:03:09 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 03 21:03:09 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 21:03:09 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 21:03:09 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 21:03:09 compute-0 sudo[47607]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:09 compute-0 python3.9[47765]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:03:10 compute-0 sudo[47921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebcdzieifrfdzqtlhdqboitflgxrkghe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795790.4516528-120-45592102627543/AnsiballZ_dnf.py'
Dec 03 21:03:10 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 03 21:03:10 compute-0 sudo[47921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:10 compute-0 python3.9[47923]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:03:12 compute-0 sudo[47921]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:12 compute-0 sudo[48074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzslpsvwruzpijqbehodaukmjbgynxzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795792.3403988-128-3303282203435/AnsiballZ_command.py'
Dec 03 21:03:12 compute-0 sudo[48074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:12 compute-0 python3.9[48076]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:03:13 compute-0 sudo[48074]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:15 compute-0 sudo[48361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxzkbulnycgywzgqzbrtaumjjormgmuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795794.8926153-136-107576842236254/AnsiballZ_file.py'
Dec 03 21:03:15 compute-0 sudo[48361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:15 compute-0 python3.9[48363]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 03 21:03:15 compute-0 sudo[48361]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:16 compute-0 python3.9[48513]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:03:16 compute-0 sudo[48665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkjlvnxttldoeykjssobzlgoyuzhqriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795796.58019-152-229255304310485/AnsiballZ_dnf.py'
Dec 03 21:03:16 compute-0 sudo[48665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:17 compute-0 python3.9[48667]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:03:19 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:03:19 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:03:19 compute-0 systemd[1]: Reloading.
Dec 03 21:03:19 compute-0 systemd-rc-local-generator[48705]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:03:19 compute-0 systemd-sysv-generator[48709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:03:19 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 21:03:20 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:03:20 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:03:20 compute-0 systemd[1]: run-re08e8272296d4b86958b9aa36390917e.service: Deactivated successfully.
Dec 03 21:03:20 compute-0 sudo[48665]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:20 compute-0 sudo[48983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aucxkyipgelogjzfcjwmqqygmywdxdsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795800.2480469-160-46443488484294/AnsiballZ_systemd.py'
Dec 03 21:03:20 compute-0 sudo[48983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:20 compute-0 python3.9[48985]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:03:20 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 03 21:03:20 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Dec 03 21:03:20 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Dec 03 21:03:20 compute-0 systemd[1]: Stopping Network Manager...
Dec 03 21:03:20 compute-0 NetworkManager[7187]: <info>  [1764795800.9402] caught SIGTERM, shutting down normally.
Dec 03 21:03:20 compute-0 NetworkManager[7187]: <info>  [1764795800.9422] dhcp4 (eth0): canceled DHCP transaction
Dec 03 21:03:20 compute-0 NetworkManager[7187]: <info>  [1764795800.9422] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 03 21:03:20 compute-0 NetworkManager[7187]: <info>  [1764795800.9422] dhcp4 (eth0): state changed no lease
Dec 03 21:03:20 compute-0 NetworkManager[7187]: <info>  [1764795800.9426] manager: NetworkManager state is now CONNECTED_SITE
Dec 03 21:03:20 compute-0 NetworkManager[7187]: <info>  [1764795800.9499] exiting (success)
Dec 03 21:03:20 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 03 21:03:20 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 03 21:03:20 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 03 21:03:20 compute-0 systemd[1]: Stopped Network Manager.
Dec 03 21:03:20 compute-0 systemd[1]: NetworkManager.service: Consumed 11.162s CPU time, 4.1M memory peak, read 0B from disk, written 35.5K to disk.
Dec 03 21:03:20 compute-0 systemd[1]: Starting Network Manager...
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.0285] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:cb12512a-3aa8-4735-9c82-f409c246c155)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.0286] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.0353] manager[0x5653f81e0090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 03 21:03:21 compute-0 systemd[1]: Starting Hostname Service...
Dec 03 21:03:21 compute-0 systemd[1]: Started Hostname Service.
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1578] hostname: hostname: using hostnamed
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1579] hostname: static hostname changed from (none) to "compute-0"
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1588] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1598] manager[0x5653f81e0090]: rfkill: Wi-Fi hardware radio set enabled
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1598] manager[0x5653f81e0090]: rfkill: WWAN hardware radio set enabled
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1639] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1655] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1656] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1658] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1659] manager: Networking is enabled by state file
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1663] settings: Loaded settings plugin: keyfile (internal)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1669] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1711] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1725] dhcp: init: Using DHCP client 'internal'
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1730] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1738] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1749] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1763] device (lo): Activation: starting connection 'lo' (2dc71c9a-e258-42ef-b117-38009802277f)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1775] device (eth0): carrier: link connected
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1782] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1790] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1791] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1801] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1811] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1820] device (eth1): carrier: link connected
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1828] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1835] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (82caecc5-1713-50fd-827f-a8910de7f4a3) (indicated)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1836] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1844] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1855] device (eth1): Activation: starting connection 'ci-private-network' (82caecc5-1713-50fd-827f-a8910de7f4a3)
Dec 03 21:03:21 compute-0 systemd[1]: Started Network Manager.
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1865] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1879] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1883] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1887] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1891] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1896] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1900] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1903] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1909] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1921] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1926] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1940] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1962] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1979] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1986] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1990] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.1998] device (lo): Activation: successful, device activated.
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2021] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 03 21:03:21 compute-0 systemd[1]: Starting Network Manager Wait Online...
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2239] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2248] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2251] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2255] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2261] device (eth1): Activation: successful, device activated.
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2312] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2314] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2319] manager: NetworkManager state is now CONNECTED_SITE
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2325] device (eth0): Activation: successful, device activated.
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2334] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 03 21:03:21 compute-0 NetworkManager[48996]: <info>  [1764795801.2383] manager: startup complete
Dec 03 21:03:21 compute-0 systemd[1]: Finished Network Manager Wait Online.
Dec 03 21:03:21 compute-0 sudo[48983]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:21 compute-0 sudo[49210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsszrcihlavlfxrwynkwihqcwvdkritf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795801.4580595-168-239231187816037/AnsiballZ_dnf.py'
Dec 03 21:03:21 compute-0 sudo[49210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:21 compute-0 python3.9[49212]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:03:26 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:03:26 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:03:26 compute-0 systemd[1]: Reloading.
Dec 03 21:03:26 compute-0 systemd-rc-local-generator[49264]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:03:26 compute-0 systemd-sysv-generator[49267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:03:26 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 21:03:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:03:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:03:27 compute-0 systemd[1]: run-rd7a8bc2f491843aea0b63b3d825837d2.service: Deactivated successfully.
Dec 03 21:03:27 compute-0 sudo[49210]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:28 compute-0 sudo[49669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebgkdyfkmmavgzydnvumzwxjaywcnkyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795808.2244182-180-245990621487564/AnsiballZ_stat.py'
Dec 03 21:03:28 compute-0 sudo[49669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:28 compute-0 python3.9[49671]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:03:28 compute-0 sudo[49669]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:29 compute-0 sudo[49821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fewlgurmptyxggymnebffbtxbkdmpjxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795808.897541-189-77866164640658/AnsiballZ_ini_file.py'
Dec 03 21:03:29 compute-0 sudo[49821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:29 compute-0 python3.9[49823]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:29 compute-0 sudo[49821]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:30 compute-0 sudo[49975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyvexzxwjjdpctcfnpcxlyefbogerjgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795809.7550316-199-74578732672427/AnsiballZ_ini_file.py'
Dec 03 21:03:30 compute-0 sudo[49975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:30 compute-0 python3.9[49977]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:30 compute-0 sudo[49975]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:30 compute-0 sudo[50127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbornxzdbdpfeazgaxvfgvhfmshmyhnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795810.439126-199-232743611444526/AnsiballZ_ini_file.py'
Dec 03 21:03:30 compute-0 sudo[50127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:30 compute-0 python3.9[50129]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:30 compute-0 sudo[50127]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:31 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 03 21:03:31 compute-0 sudo[50279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkvuzjlgtikyfwafrcchvhvroatbmedc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795811.1860425-214-170809388366810/AnsiballZ_ini_file.py'
Dec 03 21:03:31 compute-0 sudo[50279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:31 compute-0 python3.9[50281]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:31 compute-0 sudo[50279]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:32 compute-0 sudo[50431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnbpnocqubyenhasxkmnoaausgpitpmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795811.933822-214-143816169420785/AnsiballZ_ini_file.py'
Dec 03 21:03:32 compute-0 sudo[50431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:32 compute-0 python3.9[50433]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:32 compute-0 sudo[50431]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:32 compute-0 sudo[50583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxfzvbrdnkvsgzfmrwmklkkrfsmjofe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795812.5533903-229-207182494871959/AnsiballZ_stat.py'
Dec 03 21:03:32 compute-0 sudo[50583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:32 compute-0 python3.9[50585]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:03:33 compute-0 sudo[50583]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:33 compute-0 sudo[50706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mudbbpqqojuhuuvbklzgfyamosxqkaye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795812.5533903-229-207182494871959/AnsiballZ_copy.py'
Dec 03 21:03:33 compute-0 sudo[50706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:33 compute-0 python3.9[50708]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795812.5533903-229-207182494871959/.source _original_basename=.a_cf4f79 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:33 compute-0 sudo[50706]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:34 compute-0 sudo[50858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qttkpfzsbkhtysjqnwokzfyqdtckzqye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795813.853085-244-161099074913911/AnsiballZ_file.py'
Dec 03 21:03:34 compute-0 sudo[50858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:34 compute-0 python3.9[50860]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:34 compute-0 sudo[50858]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:34 compute-0 sudo[51010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hktsukslbttenvzfunarhnvtsustdvna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795814.4530144-252-124504481578945/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 03 21:03:34 compute-0 sudo[51010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:35 compute-0 python3.9[51012]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 03 21:03:35 compute-0 sudo[51010]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:35 compute-0 sudo[51162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hybffpimehoimpuroabnbwyerwkuoudp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795815.3334744-261-109960529958271/AnsiballZ_file.py'
Dec 03 21:03:35 compute-0 sudo[51162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:35 compute-0 python3.9[51164]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:35 compute-0 sudo[51162]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:36 compute-0 sudo[51314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxysmcofhrwrkpdrrgtnyketjosxhrks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795816.1600268-271-155880688405364/AnsiballZ_stat.py'
Dec 03 21:03:36 compute-0 sudo[51314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:36 compute-0 sudo[51314]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:36 compute-0 sudo[51437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymktqkgvgtsbdreziwkekhycstprfqab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795816.1600268-271-155880688405364/AnsiballZ_copy.py'
Dec 03 21:03:36 compute-0 sudo[51437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:37 compute-0 sudo[51437]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:37 compute-0 sudo[51589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idjvumueojvmqnabhiqtnczqrdwqxcwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795817.4499156-286-127803605776703/AnsiballZ_slurp.py'
Dec 03 21:03:37 compute-0 sudo[51589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:38 compute-0 python3.9[51591]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 03 21:03:38 compute-0 sudo[51589]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:38 compute-0 sudo[51764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijwodvolregcbynmoqueamymlssetaqz ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795818.2231066-295-252145472369764/async_wrapper.py j632954729244 300 /home/zuul/.ansible/tmp/ansible-tmp-1764795818.2231066-295-252145472369764/AnsiballZ_edpm_os_net_config.py _'
Dec 03 21:03:38 compute-0 sudo[51764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:39 compute-0 ansible-async_wrapper.py[51766]: Invoked with j632954729244 300 /home/zuul/.ansible/tmp/ansible-tmp-1764795818.2231066-295-252145472369764/AnsiballZ_edpm_os_net_config.py _
Dec 03 21:03:39 compute-0 ansible-async_wrapper.py[51769]: Starting module and watcher
Dec 03 21:03:39 compute-0 ansible-async_wrapper.py[51769]: Start watching 51770 (300)
Dec 03 21:03:39 compute-0 ansible-async_wrapper.py[51770]: Start module (51770)
Dec 03 21:03:39 compute-0 ansible-async_wrapper.py[51766]: Return async_wrapper task started.
Dec 03 21:03:39 compute-0 sudo[51764]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:39 compute-0 python3.9[51771]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 03 21:03:39 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 03 21:03:39 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 03 21:03:39 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 03 21:03:39 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 03 21:03:39 compute-0 kernel: cfg80211: failed to load regulatory.db
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.0655] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.0673] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1162] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1163] audit: op="connection-add" uuid="8c167eb1-5903-4c8a-9b7a-1f7180590fe2" name="br-ex-br" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1177] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1179] audit: op="connection-add" uuid="3c63e537-d723-49f2-8a13-2ec654a09dab" name="br-ex-port" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1191] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1201] audit: op="connection-add" uuid="edbf8d0d-f777-4722-b65c-923d33f3c339" name="eth1-port" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1211] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1212] audit: op="connection-add" uuid="f7a9ac4a-94ea-4a62-a8b4-d1c718c840d2" name="vlan20-port" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1222] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1224] audit: op="connection-add" uuid="2733bc75-1892-48e8-8bdf-7fb6bb8c6608" name="vlan21-port" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1234] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1236] audit: op="connection-add" uuid="ceacbaec-a9b5-48f3-8f85-bbdc1fa6a57b" name="vlan22-port" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1246] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1247] audit: op="connection-add" uuid="b2408982-c017-4a7a-9d60-128e2abc7a05" name="vlan23-port" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1266] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1280] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1282] audit: op="connection-add" uuid="9434fda8-8961-4e83-8d0a-2530a2533efc" name="br-ex-if" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1347] audit: op="connection-update" uuid="82caecc5-1713-50fd-827f-a8910de7f4a3" name="ci-private-network" args="ipv4.dns,ipv4.addresses,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ovs-interface.type,ipv6.addr-gen-mode,ipv6.addresses,ipv6.method,ipv6.dns,ipv6.routes,ipv6.routing-rules,connection.controller,connection.master,connection.port-type,connection.timestamp,connection.slave-type,ovs-external-ids.data" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1362] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1364] audit: op="connection-add" uuid="8fb07113-a063-480b-985d-a6f24fbd45d4" name="vlan20-if" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1378] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1379] audit: op="connection-add" uuid="971d684f-fc1b-4aff-916f-5b92f8e6d9a9" name="vlan21-if" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1393] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1394] audit: op="connection-add" uuid="4e702b00-3924-42dc-8632-35b8cbca1464" name="vlan22-if" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1408] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1409] audit: op="connection-add" uuid="f611a32b-045a-40ae-aafd-5036a92f60e4" name="vlan23-if" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1421] audit: op="connection-delete" uuid="3fbb02b6-2141-3242-b016-afde93023b71" name="Wired connection 1" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1431] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1440] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1444] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (8c167eb1-5903-4c8a-9b7a-1f7180590fe2)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1444] audit: op="connection-activate" uuid="8c167eb1-5903-4c8a-9b7a-1f7180590fe2" name="br-ex-br" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1446] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1452] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1456] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (3c63e537-d723-49f2-8a13-2ec654a09dab)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1458] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1463] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1467] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (edbf8d0d-f777-4722-b65c-923d33f3c339)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1468] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1474] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1478] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (f7a9ac4a-94ea-4a62-a8b4-d1c718c840d2)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1480] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1486] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1489] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (2733bc75-1892-48e8-8bdf-7fb6bb8c6608)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1490] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1497] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1500] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ceacbaec-a9b5-48f3-8f85-bbdc1fa6a57b)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1502] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1508] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1511] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (b2408982-c017-4a7a-9d60-128e2abc7a05)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1512] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1514] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1516] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1522] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1526] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1530] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (9434fda8-8961-4e83-8d0a-2530a2533efc)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1531] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1534] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1536] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1537] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1538] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1547] device (eth1): disconnecting for new activation request.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1548] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1550] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1552] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1552] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1555] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1559] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1563] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8fb07113-a063-480b-985d-a6f24fbd45d4)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1564] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1567] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1569] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1571] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1573] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1579] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1584] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (971d684f-fc1b-4aff-916f-5b92f8e6d9a9)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1585] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1589] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1591] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1592] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1596] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1601] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1605] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (4e702b00-3924-42dc-8632-35b8cbca1464)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1606] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1609] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1611] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1613] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1616] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1620] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1624] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (f611a32b-045a-40ae-aafd-5036a92f60e4)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1625] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1628] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1630] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1631] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1632] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1643] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1645] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1648] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1650] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1656] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1659] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1663] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1665] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1667] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1672] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 kernel: ovs-system: entered promiscuous mode
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1677] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1680] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1682] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1687] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1690] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1693] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1695] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 systemd-udevd[51777]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 21:03:41 compute-0 kernel: Timeout policy base is empty
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1700] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1704] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1707] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1709] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1714] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1718] dhcp4 (eth0): canceled DHCP transaction
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1719] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1719] dhcp4 (eth0): state changed no lease
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1720] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1731] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1735] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51772 uid=0 result="fail" reason="Device is not activated"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1739] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 03 21:03:41 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1773] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1781] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1785] dhcp4 (eth0): state changed new lease, address=38.102.83.219
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1790] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1830] device (eth1): disconnecting for new activation request.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1831] audit: op="connection-activate" uuid="82caecc5-1713-50fd-827f-a8910de7f4a3" name="ci-private-network" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.1916] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2021] device (eth1): Activation: starting connection 'ci-private-network' (82caecc5-1713-50fd-827f-a8910de7f4a3)
Dec 03 21:03:41 compute-0 kernel: br-ex: entered promiscuous mode
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2026] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2027] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51772 uid=0 result="success"
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2028] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2028] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2029] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2030] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2032] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2033] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2037] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2039] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2042] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2045] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2048] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2051] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2054] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2058] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2061] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2063] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2066] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2070] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2073] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2077] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2080] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2084] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2091] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2102] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 03 21:03:41 compute-0 kernel: vlan22: entered promiscuous mode
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2106] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 systemd-udevd[51776]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 21:03:41 compute-0 kernel: vlan23: entered promiscuous mode
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2176] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2178] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 systemd-udevd[51778]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2182] device (eth1): Activation: successful, device activated.
Dec 03 21:03:41 compute-0 kernel: vlan20: entered promiscuous mode
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2269] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2274] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2282] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2299] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2306] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2324] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 kernel: vlan21: entered promiscuous mode
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2333] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2334] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2339] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2344] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2350] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2355] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2394] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2395] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2402] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2409] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2425] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2458] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2459] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2461] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2465] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2479] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2523] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2524] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 03 21:03:41 compute-0 NetworkManager[48996]: <info>  [1764795821.2528] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 03 21:03:42 compute-0 NetworkManager[48996]: <info>  [1764795822.3667] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51772 uid=0 result="success"
Dec 03 21:03:42 compute-0 NetworkManager[48996]: <info>  [1764795822.5879] checkpoint[0x5653f81b5950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 03 21:03:42 compute-0 NetworkManager[48996]: <info>  [1764795822.5881] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51772 uid=0 result="success"
Dec 03 21:03:42 compute-0 sudo[52129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfadwbelukzdhjshdsqpxizmhevxzccf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795822.2027197-295-82323156771094/AnsiballZ_async_status.py'
Dec 03 21:03:42 compute-0 sudo[52129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:42 compute-0 python3.9[52131]: ansible-ansible.legacy.async_status Invoked with jid=j632954729244.51766 mode=status _async_dir=/root/.ansible_async
Dec 03 21:03:42 compute-0 NetworkManager[48996]: <info>  [1764795822.8789] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51772 uid=0 result="success"
Dec 03 21:03:42 compute-0 NetworkManager[48996]: <info>  [1764795822.8803] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51772 uid=0 result="success"
Dec 03 21:03:42 compute-0 sudo[52129]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:43 compute-0 NetworkManager[48996]: <info>  [1764795823.0991] audit: op="networking-control" arg="global-dns-configuration" pid=51772 uid=0 result="success"
Dec 03 21:03:43 compute-0 NetworkManager[48996]: <info>  [1764795823.3062] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 03 21:03:43 compute-0 NetworkManager[48996]: <info>  [1764795823.3694] audit: op="networking-control" arg="global-dns-configuration" pid=51772 uid=0 result="success"
Dec 03 21:03:43 compute-0 NetworkManager[48996]: <info>  [1764795823.4122] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51772 uid=0 result="success"
Dec 03 21:03:43 compute-0 NetworkManager[48996]: <info>  [1764795823.6216] checkpoint[0x5653f81b5a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 03 21:03:43 compute-0 NetworkManager[48996]: <info>  [1764795823.6222] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51772 uid=0 result="success"
Dec 03 21:03:43 compute-0 ansible-async_wrapper.py[51770]: Module complete (51770)
Dec 03 21:03:44 compute-0 ansible-async_wrapper.py[51769]: Done in kid B.
Dec 03 21:03:46 compute-0 sudo[52234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sraygjqjcvaudbxvhsqhhmskrpculftu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795822.2027197-295-82323156771094/AnsiballZ_async_status.py'
Dec 03 21:03:46 compute-0 sudo[52234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:46 compute-0 python3.9[52236]: ansible-ansible.legacy.async_status Invoked with jid=j632954729244.51766 mode=status _async_dir=/root/.ansible_async
Dec 03 21:03:46 compute-0 sudo[52234]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:46 compute-0 sudo[52334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxhzzuuixzhpeikdpceshwxmnkbcbawn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795822.2027197-295-82323156771094/AnsiballZ_async_status.py'
Dec 03 21:03:46 compute-0 sudo[52334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:46 compute-0 python3.9[52336]: ansible-ansible.legacy.async_status Invoked with jid=j632954729244.51766 mode=cleanup _async_dir=/root/.ansible_async
Dec 03 21:03:46 compute-0 sudo[52334]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:47 compute-0 sudo[52486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecuqmauwcayluzpzwwwekcdeinuvufct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795827.0623922-322-128559723055906/AnsiballZ_stat.py'
Dec 03 21:03:47 compute-0 sudo[52486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:47 compute-0 python3.9[52488]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:03:47 compute-0 sudo[52486]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:47 compute-0 sudo[52609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtjpnrsupukamhzmeahqzupvatbxhdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795827.0623922-322-128559723055906/AnsiballZ_copy.py'
Dec 03 21:03:47 compute-0 sudo[52609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:48 compute-0 python3.9[52611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795827.0623922-322-128559723055906/.source.returncode _original_basename=.6b49ecuh follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:48 compute-0 sudo[52609]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:48 compute-0 sudo[52761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqsjywfarudvpuapumwlnodxueyzvzkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795828.4498706-338-12156648532113/AnsiballZ_stat.py'
Dec 03 21:03:48 compute-0 sudo[52761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:48 compute-0 python3.9[52763]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:03:48 compute-0 sudo[52761]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:49 compute-0 sudo[52884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsyzorjynnbxoisbjycbzzkqojbsuhvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795828.4498706-338-12156648532113/AnsiballZ_copy.py'
Dec 03 21:03:49 compute-0 sudo[52884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:49 compute-0 python3.9[52886]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795828.4498706-338-12156648532113/.source.cfg _original_basename=.7avz18o5 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:03:49 compute-0 sudo[52884]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:49 compute-0 sudo[53037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lviiwehahdvuyulqdoydsxqukvexxzue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795829.626482-353-190878137136737/AnsiballZ_systemd.py'
Dec 03 21:03:49 compute-0 sudo[53037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:03:50 compute-0 python3.9[53039]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:03:50 compute-0 systemd[1]: Reloading Network Manager...
Dec 03 21:03:50 compute-0 NetworkManager[48996]: <info>  [1764795830.3900] audit: op="reload" arg="0" pid=53043 uid=0 result="success"
Dec 03 21:03:50 compute-0 NetworkManager[48996]: <info>  [1764795830.3909] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 03 21:03:50 compute-0 systemd[1]: Reloaded Network Manager.
Dec 03 21:03:50 compute-0 sudo[53037]: pam_unix(sudo:session): session closed for user root
Dec 03 21:03:50 compute-0 sshd-session[44993]: Connection closed by 192.168.122.30 port 42026
Dec 03 21:03:50 compute-0 sshd-session[44990]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:03:50 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Dec 03 21:03:50 compute-0 systemd[1]: session-9.scope: Consumed 49.984s CPU time.
Dec 03 21:03:50 compute-0 systemd-logind[787]: Session 9 logged out. Waiting for processes to exit.
Dec 03 21:03:50 compute-0 systemd-logind[787]: Removed session 9.
Dec 03 21:03:51 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 03 21:03:55 compute-0 sshd-session[53076]: Accepted publickey for zuul from 192.168.122.30 port 35210 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:03:56 compute-0 systemd-logind[787]: New session 10 of user zuul.
Dec 03 21:03:56 compute-0 systemd[1]: Started Session 10 of User zuul.
Dec 03 21:03:56 compute-0 sshd-session[53076]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:03:57 compute-0 python3.9[53229]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:03:58 compute-0 python3.9[53383]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:03:59 compute-0 python3.9[53576]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:03:59 compute-0 sshd-session[53079]: Connection closed by 192.168.122.30 port 35210
Dec 03 21:03:59 compute-0 sshd-session[53076]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:03:59 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Dec 03 21:03:59 compute-0 systemd[1]: session-10.scope: Consumed 2.391s CPU time.
Dec 03 21:03:59 compute-0 systemd-logind[787]: Session 10 logged out. Waiting for processes to exit.
Dec 03 21:03:59 compute-0 systemd-logind[787]: Removed session 10.
Dec 03 21:04:00 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 03 21:04:07 compute-0 sshd-session[53606]: Accepted publickey for zuul from 192.168.122.30 port 39328 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:04:07 compute-0 systemd-logind[787]: New session 11 of user zuul.
Dec 03 21:04:07 compute-0 systemd[1]: Started Session 11 of User zuul.
Dec 03 21:04:07 compute-0 sshd-session[53606]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:04:08 compute-0 python3.9[53759]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:04:09 compute-0 python3.9[53914]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:04:10 compute-0 sudo[54068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjwnbfabzefqhtkmvacuiheqdxedsuwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795849.9810152-40-247083968137102/AnsiballZ_setup.py'
Dec 03 21:04:10 compute-0 sudo[54068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:10 compute-0 python3.9[54070]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:04:10 compute-0 sudo[54068]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:11 compute-0 sudo[54153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtemmeooujruknqbrdpyvegvjtczaosx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795849.9810152-40-247083968137102/AnsiballZ_dnf.py'
Dec 03 21:04:11 compute-0 sudo[54153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:11 compute-0 sshd-session[54079]: error: kex_exchange_identification: read: Connection reset by peer
Dec 03 21:04:11 compute-0 sshd-session[54079]: Connection reset by 93.219.6.79 port 55428
Dec 03 21:04:11 compute-0 python3.9[54155]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:04:12 compute-0 sudo[54153]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:13 compute-0 sudo[54309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjdlpdrvjyvltlebcbacbbocsnnaykim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795853.1371264-52-192408027219358/AnsiballZ_setup.py'
Dec 03 21:04:13 compute-0 sudo[54309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:13 compute-0 python3.9[54311]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:04:14 compute-0 sudo[54309]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:14 compute-0 sudo[54504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzomlvjoazdqfjcxkzvksiieetjtuywj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795854.2318206-63-180347264647649/AnsiballZ_file.py'
Dec 03 21:04:14 compute-0 sudo[54504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:14 compute-0 python3.9[54506]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:04:14 compute-0 sudo[54504]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:15 compute-0 sudo[54656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooqaajltlnrjdnzvhdbewqjvzqyyjvfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795855.1200838-71-46157405806269/AnsiballZ_command.py'
Dec 03 21:04:15 compute-0 sudo[54656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:15 compute-0 python3.9[54658]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:04:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1766208549-merged.mount: Deactivated successfully.
Dec 03 21:04:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1198909937-merged.mount: Deactivated successfully.
Dec 03 21:04:15 compute-0 podman[54659]: 2025-12-03 21:04:15.9432519 +0000 UTC m=+0.073472754 system refresh
Dec 03 21:04:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:04:15 compute-0 sudo[54656]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:16 compute-0 sudo[54819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbwipbnyexedukqmlvgpaujgsgnmykpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795856.179343-79-84528925027707/AnsiballZ_stat.py'
Dec 03 21:04:16 compute-0 sudo[54819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:16 compute-0 python3.9[54821]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:04:16 compute-0 sudo[54819]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:17 compute-0 sudo[54942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nefhlawpefdqgntfucjklrrulyysohie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795856.179343-79-84528925027707/AnsiballZ_copy.py'
Dec 03 21:04:17 compute-0 sudo[54942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:17 compute-0 python3.9[54944]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795856.179343-79-84528925027707/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9bc2e1a602f29a322097501b442d6abeaac1c740 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:04:17 compute-0 sudo[54942]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:18 compute-0 sudo[55094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyugdldztclxznhmgnekjcuxcsmceaxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795857.8341827-94-207874164122947/AnsiballZ_stat.py'
Dec 03 21:04:18 compute-0 sudo[55094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:18 compute-0 python3.9[55096]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:04:18 compute-0 sudo[55094]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:18 compute-0 sudo[55217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amhwkoqbvxxvsqsbltykpsnqjtsshhby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795857.8341827-94-207874164122947/AnsiballZ_copy.py'
Dec 03 21:04:18 compute-0 sudo[55217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:18 compute-0 python3.9[55219]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764795857.8341827-94-207874164122947/.source.conf follow=False _original_basename=registries.conf.j2 checksum=8c73fbc0d7cddf5b89d40cde842a385025fa8102 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:04:18 compute-0 sudo[55217]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:19 compute-0 sudo[55369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnnspqowxufqhhtjhfoqstystwqiwxrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795859.0885453-110-94534415386552/AnsiballZ_ini_file.py'
Dec 03 21:04:19 compute-0 sudo[55369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:19 compute-0 python3.9[55371]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:04:19 compute-0 sudo[55369]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:20 compute-0 sudo[55521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfwvdbcgzbyzctpqzpayttsjppbqjxfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795859.8674529-110-268677603691547/AnsiballZ_ini_file.py'
Dec 03 21:04:20 compute-0 sudo[55521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:20 compute-0 python3.9[55523]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:04:20 compute-0 sudo[55521]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:20 compute-0 sudo[55673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dinciuyuycxteeqfrztkgsbdqtkuztkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795860.6223514-110-92938667192400/AnsiballZ_ini_file.py'
Dec 03 21:04:20 compute-0 sudo[55673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:21 compute-0 python3.9[55675]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:04:21 compute-0 sudo[55673]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:21 compute-0 sudo[55825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqhzhbiitilnauxvaeucuwjthurufbig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795861.223898-110-127139563041276/AnsiballZ_ini_file.py'
Dec 03 21:04:21 compute-0 sudo[55825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:21 compute-0 python3.9[55827]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:04:21 compute-0 sudo[55825]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:22 compute-0 sudo[55977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwlqaqqszvuoefsqaayisrxpedebzyvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795862.214525-141-264465570790384/AnsiballZ_dnf.py'
Dec 03 21:04:22 compute-0 sudo[55977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:22 compute-0 python3.9[55979]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:04:24 compute-0 sudo[55977]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:24 compute-0 sudo[56130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srccxqnwklqyimnjbkuyjsdcmtynmbhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795864.4677904-152-9598286280898/AnsiballZ_setup.py'
Dec 03 21:04:24 compute-0 sudo[56130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:25 compute-0 python3.9[56132]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:04:25 compute-0 sudo[56130]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:25 compute-0 sudo[56284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbsiltvxrswgssrzrsruixngugfwdzuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795865.3196204-160-60875015970584/AnsiballZ_stat.py'
Dec 03 21:04:25 compute-0 sudo[56284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:25 compute-0 python3.9[56286]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:04:25 compute-0 sudo[56284]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:26 compute-0 sudo[56436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckgpqyfizqbcidvcrbgtdoofappwapgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795866.1019568-169-69056840229096/AnsiballZ_stat.py'
Dec 03 21:04:26 compute-0 sudo[56436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:26 compute-0 python3.9[56438]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:04:26 compute-0 sudo[56436]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:27 compute-0 sudo[56588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvekiasxhxfekdezcwpgodcgnmttgdpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795866.9087522-179-160164975170653/AnsiballZ_command.py'
Dec 03 21:04:27 compute-0 sudo[56588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:27 compute-0 python3.9[56590]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:04:27 compute-0 sudo[56588]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:28 compute-0 sudo[56741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvggjopezevcgrvouhhasjbbjeunggjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795867.738907-189-24695945941615/AnsiballZ_service_facts.py'
Dec 03 21:04:28 compute-0 sudo[56741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:28 compute-0 python3.9[56743]: ansible-service_facts Invoked
Dec 03 21:04:28 compute-0 network[56760]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:04:28 compute-0 network[56761]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:04:28 compute-0 network[56762]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:04:31 compute-0 sshd-session[54157]: Invalid user a from 93.219.6.79 port 57664
Dec 03 21:04:32 compute-0 sudo[56741]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:33 compute-0 sshd-session[54157]: Connection closed by invalid user a 93.219.6.79 port 57664 [preauth]
Dec 03 21:04:33 compute-0 sudo[57045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aphhsidftpsyhyfnilzcdwdafrjobipr ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764795873.2012553-204-155857944412521/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764795873.2012553-204-155857944412521/args'
Dec 03 21:04:33 compute-0 sudo[57045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:33 compute-0 sudo[57045]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:34 compute-0 sudo[57212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aapluyrzgvmbgkeoqwcyoxegbgirzykd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795873.9151192-215-108004443661972/AnsiballZ_dnf.py'
Dec 03 21:04:34 compute-0 sudo[57212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:34 compute-0 python3.9[57214]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:04:35 compute-0 sudo[57212]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:36 compute-0 sudo[57365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcqxxucxxxeflkrnpzhwlmegchxrkkes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795876.118356-228-102607500718606/AnsiballZ_package_facts.py'
Dec 03 21:04:36 compute-0 sudo[57365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:37 compute-0 python3.9[57367]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 03 21:04:37 compute-0 sudo[57365]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:38 compute-0 sudo[57517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvvxnyjcpvzgtritiqzedoooozzqchla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795877.8040223-238-101484405873619/AnsiballZ_stat.py'
Dec 03 21:04:38 compute-0 sudo[57517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:38 compute-0 python3.9[57519]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:04:38 compute-0 sudo[57517]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:38 compute-0 sudo[57642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjltscrdyszyjrwepudfpcpjgeaxjrit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795877.8040223-238-101484405873619/AnsiballZ_copy.py'
Dec 03 21:04:38 compute-0 sudo[57642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:38 compute-0 python3.9[57644]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795877.8040223-238-101484405873619/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:04:38 compute-0 sudo[57642]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:39 compute-0 sudo[57796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpgtdiwctglwsyvhdfajckskmbwgzisj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795879.1879485-253-231961925731425/AnsiballZ_stat.py'
Dec 03 21:04:39 compute-0 sudo[57796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:39 compute-0 python3.9[57798]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:04:39 compute-0 sudo[57796]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:40 compute-0 sudo[57921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scfqlosewmxinehtbkrpgobxxpkirhgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795879.1879485-253-231961925731425/AnsiballZ_copy.py'
Dec 03 21:04:40 compute-0 sudo[57921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:40 compute-0 python3.9[57923]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795879.1879485-253-231961925731425/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:04:40 compute-0 sudo[57921]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:41 compute-0 sudo[58075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huxanpidbyuxfmpnzhnxhskiaiqkvxfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795880.8181007-274-255120236857444/AnsiballZ_lineinfile.py'
Dec 03 21:04:41 compute-0 sudo[58075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:41 compute-0 python3.9[58077]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:04:41 compute-0 sudo[58075]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:42 compute-0 sudo[58229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdldguioxejxfodplyqxdsreuluxkphm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795882.3591547-289-244604801205283/AnsiballZ_setup.py'
Dec 03 21:04:42 compute-0 sudo[58229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:42 compute-0 python3.9[58231]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:04:43 compute-0 sudo[58229]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:43 compute-0 sudo[58313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnbqhgoessxrevvllzbyrfhmrfwbqlte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795882.3591547-289-244604801205283/AnsiballZ_systemd.py'
Dec 03 21:04:43 compute-0 sudo[58313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:44 compute-0 python3.9[58315]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:04:44 compute-0 sudo[58313]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:44 compute-0 sudo[58467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujmhafpevdjsciwbyxogkkqnkpnaagkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795884.7190907-305-132948884764341/AnsiballZ_setup.py'
Dec 03 21:04:44 compute-0 sudo[58467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:45 compute-0 python3.9[58469]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:04:45 compute-0 sudo[58467]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:45 compute-0 sudo[58551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctvjobpfkfklsjoajcxadyvaiimwwnwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795884.7190907-305-132948884764341/AnsiballZ_systemd.py'
Dec 03 21:04:45 compute-0 sudo[58551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:46 compute-0 python3.9[58553]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:04:46 compute-0 chronyd[798]: chronyd exiting
Dec 03 21:04:46 compute-0 systemd[1]: Stopping NTP client/server...
Dec 03 21:04:46 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Dec 03 21:04:46 compute-0 systemd[1]: Stopped NTP client/server.
Dec 03 21:04:46 compute-0 systemd[1]: Starting NTP client/server...
Dec 03 21:04:46 compute-0 chronyd[58561]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 03 21:04:46 compute-0 chronyd[58561]: Frequency -26.368 +/- 0.290 ppm read from /var/lib/chrony/drift
Dec 03 21:04:46 compute-0 chronyd[58561]: Loaded seccomp filter (level 2)
Dec 03 21:04:46 compute-0 systemd[1]: Started NTP client/server.
Dec 03 21:04:46 compute-0 sudo[58551]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:46 compute-0 sshd-session[53609]: Connection closed by 192.168.122.30 port 39328
Dec 03 21:04:46 compute-0 sshd-session[53606]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:04:46 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Dec 03 21:04:46 compute-0 systemd[1]: session-11.scope: Consumed 27.574s CPU time.
Dec 03 21:04:46 compute-0 systemd-logind[787]: Session 11 logged out. Waiting for processes to exit.
Dec 03 21:04:46 compute-0 systemd-logind[787]: Removed session 11.
Dec 03 21:04:51 compute-0 sshd-session[58587]: Accepted publickey for zuul from 192.168.122.30 port 58722 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:04:51 compute-0 systemd-logind[787]: New session 12 of user zuul.
Dec 03 21:04:51 compute-0 systemd[1]: Started Session 12 of User zuul.
Dec 03 21:04:51 compute-0 sshd-session[58587]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:04:52 compute-0 sudo[58740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yraazvwilculsjurrfysldjxzpfdvgmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795891.9326334-22-151645400474945/AnsiballZ_file.py'
Dec 03 21:04:52 compute-0 sudo[58740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:52 compute-0 python3.9[58742]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:04:52 compute-0 sudo[58740]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:53 compute-0 sudo[58892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcugtybjcphgxklvxkekzcmumiknwsza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795892.9113667-34-41276866286450/AnsiballZ_stat.py'
Dec 03 21:04:53 compute-0 sudo[58892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:53 compute-0 python3.9[58894]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:04:53 compute-0 sudo[58892]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:54 compute-0 sudo[59015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-valqeuhinytwhzxmzdtbkqtqahephvyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795892.9113667-34-41276866286450/AnsiballZ_copy.py'
Dec 03 21:04:54 compute-0 sudo[59015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:04:54 compute-0 python3.9[59017]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795892.9113667-34-41276866286450/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:04:54 compute-0 sudo[59015]: pam_unix(sudo:session): session closed for user root
Dec 03 21:04:54 compute-0 sshd-session[58590]: Connection closed by 192.168.122.30 port 58722
Dec 03 21:04:54 compute-0 sshd-session[58587]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:04:54 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Dec 03 21:04:54 compute-0 systemd[1]: session-12.scope: Consumed 1.761s CPU time.
Dec 03 21:04:54 compute-0 systemd-logind[787]: Session 12 logged out. Waiting for processes to exit.
Dec 03 21:04:54 compute-0 systemd-logind[787]: Removed session 12.
Dec 03 21:05:01 compute-0 sshd-session[59042]: Accepted publickey for zuul from 192.168.122.30 port 42376 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:05:01 compute-0 systemd-logind[787]: New session 13 of user zuul.
Dec 03 21:05:01 compute-0 systemd[1]: Started Session 13 of User zuul.
Dec 03 21:05:01 compute-0 sshd-session[59042]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:05:02 compute-0 python3.9[59195]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:05:03 compute-0 sudo[59349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxqfyifqeacznvclqttkpvtrkpyuctqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795902.696487-33-166237250931441/AnsiballZ_file.py'
Dec 03 21:05:03 compute-0 sudo[59349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:03 compute-0 python3.9[59351]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:03 compute-0 sudo[59349]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:04 compute-0 sudo[59524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwytrpjdnjkaqlzuxqgcunelebqhfier ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795903.5440652-41-15210904089634/AnsiballZ_stat.py'
Dec 03 21:05:04 compute-0 sudo[59524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:04 compute-0 python3.9[59526]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:04 compute-0 sudo[59524]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:04 compute-0 sudo[59647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuqilzglpemdupehzpezghrvgfuvimrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795903.5440652-41-15210904089634/AnsiballZ_copy.py'
Dec 03 21:05:04 compute-0 sudo[59647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:05 compute-0 python3.9[59649]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764795903.5440652-41-15210904089634/.source.json _original_basename=.yirh0_ms follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:05 compute-0 sudo[59647]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:05 compute-0 sudo[59799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acawfeacxjjgvxdvgxbynxqizqblfrtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795905.4087315-64-118928620769666/AnsiballZ_stat.py'
Dec 03 21:05:05 compute-0 sudo[59799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:05 compute-0 python3.9[59801]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:05 compute-0 sudo[59799]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:06 compute-0 sudo[59922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oznfjbhlprtywysguquldhlzcfnfzmoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795905.4087315-64-118928620769666/AnsiballZ_copy.py'
Dec 03 21:05:06 compute-0 sudo[59922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:06 compute-0 python3.9[59924]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795905.4087315-64-118928620769666/.source _original_basename=.hgzoijt1 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:06 compute-0 sudo[59922]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:06 compute-0 sudo[60074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjoqhkzvrhtpsremkfpbfgigmmkqlywy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795906.6995642-80-111173809164986/AnsiballZ_file.py'
Dec 03 21:05:06 compute-0 sudo[60074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:07 compute-0 python3.9[60076]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:05:07 compute-0 sudo[60074]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:07 compute-0 sudo[60226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jehczxbvfukasqaidcksohuqlsaunlpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795907.3561344-88-133653242250833/AnsiballZ_stat.py'
Dec 03 21:05:07 compute-0 sudo[60226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:07 compute-0 python3.9[60228]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:07 compute-0 sudo[60226]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:08 compute-0 sudo[60349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thbpxaiealopcwjhbagsyepnutuoolqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795907.3561344-88-133653242250833/AnsiballZ_copy.py'
Dec 03 21:05:08 compute-0 sudo[60349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:08 compute-0 python3.9[60351]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764795907.3561344-88-133653242250833/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:05:08 compute-0 sudo[60349]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:08 compute-0 sudo[60501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frwwnizszpjojrnvvdmopjthqzfpgowf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795908.6217518-88-143673909519525/AnsiballZ_stat.py'
Dec 03 21:05:08 compute-0 sudo[60501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:09 compute-0 python3.9[60503]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:09 compute-0 sudo[60501]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:09 compute-0 sudo[60624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pebaexrylvrsxxiocavflqtjdcceugiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795908.6217518-88-143673909519525/AnsiballZ_copy.py'
Dec 03 21:05:09 compute-0 sudo[60624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:09 compute-0 python3.9[60626]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764795908.6217518-88-143673909519525/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:05:09 compute-0 sudo[60624]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:10 compute-0 sudo[60776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rojbbzxrgrmjsevnhqjsrvyfyydrgapf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795909.9436076-117-8906655677098/AnsiballZ_file.py'
Dec 03 21:05:10 compute-0 sudo[60776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:10 compute-0 python3.9[60778]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:10 compute-0 sudo[60776]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:11 compute-0 sudo[60928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uklhvzywxvrmzhpcwdleipoholkfmxpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795910.713527-125-40185827812188/AnsiballZ_stat.py'
Dec 03 21:05:11 compute-0 sudo[60928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:11 compute-0 python3.9[60930]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:11 compute-0 sudo[60928]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:11 compute-0 sudo[61051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzpbwgufzxtvfpkqsrbjzephduzcyliq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795910.713527-125-40185827812188/AnsiballZ_copy.py'
Dec 03 21:05:11 compute-0 sudo[61051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:11 compute-0 python3.9[61053]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795910.713527-125-40185827812188/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:11 compute-0 sudo[61051]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:12 compute-0 sudo[61203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlfymxhteztkvjanvuolkdgpeymebnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795912.173176-140-26140985187515/AnsiballZ_stat.py'
Dec 03 21:05:12 compute-0 sudo[61203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:12 compute-0 python3.9[61205]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:12 compute-0 sudo[61203]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:13 compute-0 sudo[61326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izeuxbdscbujurtnvxwweynbpssqxsae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795912.173176-140-26140985187515/AnsiballZ_copy.py'
Dec 03 21:05:13 compute-0 sudo[61326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:13 compute-0 python3.9[61328]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795912.173176-140-26140985187515/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:13 compute-0 sudo[61326]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:14 compute-0 sudo[61478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqixmmqwhqsbutnfvedmbgvlcxmrahmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795913.5816667-155-141997059258399/AnsiballZ_systemd.py'
Dec 03 21:05:14 compute-0 sudo[61478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:14 compute-0 python3.9[61480]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:05:14 compute-0 systemd[1]: Reloading.
Dec 03 21:05:14 compute-0 systemd-rc-local-generator[61507]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:05:14 compute-0 systemd-sysv-generator[61511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:05:14 compute-0 systemd[1]: Reloading.
Dec 03 21:05:14 compute-0 systemd-rc-local-generator[61547]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:05:14 compute-0 systemd-sysv-generator[61551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:05:15 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Dec 03 21:05:15 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Dec 03 21:05:15 compute-0 sudo[61478]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:15 compute-0 sudo[61707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuixvezjyqvmlfdtaxnoldxnurrquyuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795915.3151317-163-225910520478162/AnsiballZ_stat.py'
Dec 03 21:05:15 compute-0 sudo[61707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:15 compute-0 python3.9[61709]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:15 compute-0 sudo[61707]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:16 compute-0 sudo[61830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grurpuasjpndgfajgbfucuibofytdahq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795915.3151317-163-225910520478162/AnsiballZ_copy.py'
Dec 03 21:05:16 compute-0 sudo[61830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:16 compute-0 python3.9[61832]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795915.3151317-163-225910520478162/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:16 compute-0 sudo[61830]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:16 compute-0 sudo[61982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utnvfagpggqlpvymcoueqyhkqcdkunnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795916.6644263-178-142760279748827/AnsiballZ_stat.py'
Dec 03 21:05:16 compute-0 sudo[61982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:17 compute-0 python3.9[61984]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:17 compute-0 sudo[61982]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:17 compute-0 sudo[62105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjqydvfjorkttichtezuakrqlacpdgbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795916.6644263-178-142760279748827/AnsiballZ_copy.py'
Dec 03 21:05:17 compute-0 sudo[62105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:17 compute-0 python3.9[62107]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795916.6644263-178-142760279748827/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:17 compute-0 sudo[62105]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:18 compute-0 sudo[62257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfhzanvnbhuwabfckbglstkigfjwfzmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795917.9202635-193-7794908505686/AnsiballZ_systemd.py'
Dec 03 21:05:18 compute-0 sudo[62257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:18 compute-0 python3.9[62259]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:05:18 compute-0 systemd[1]: Reloading.
Dec 03 21:05:18 compute-0 systemd-rc-local-generator[62283]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:05:18 compute-0 systemd-sysv-generator[62288]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:05:18 compute-0 systemd[1]: Reloading.
Dec 03 21:05:18 compute-0 systemd-sysv-generator[62329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:05:18 compute-0 systemd-rc-local-generator[62323]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:05:19 compute-0 systemd[1]: Starting Create netns directory...
Dec 03 21:05:19 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 03 21:05:19 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 03 21:05:19 compute-0 systemd[1]: Finished Create netns directory.
Dec 03 21:05:19 compute-0 sudo[62257]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:20 compute-0 python3.9[62485]: ansible-ansible.builtin.service_facts Invoked
Dec 03 21:05:20 compute-0 network[62502]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:05:20 compute-0 network[62503]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:05:20 compute-0 network[62504]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:05:22 compute-0 sshd-session[62510]: Invalid user admin from 45.135.232.92 port 33318
Dec 03 21:05:22 compute-0 sshd-session[62510]: Connection reset by invalid user admin 45.135.232.92 port 33318 [preauth]
Dec 03 21:05:24 compute-0 sudo[62768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oafwbguvurhgdnnipshnlbzjhapecomr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795923.7159443-209-47253301196687/AnsiballZ_systemd.py'
Dec 03 21:05:24 compute-0 sudo[62768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:24 compute-0 sshd-session[62586]: Invalid user admin from 45.135.232.92 port 33328
Dec 03 21:05:24 compute-0 python3.9[62770]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:05:24 compute-0 systemd[1]: Reloading.
Dec 03 21:05:24 compute-0 systemd-rc-local-generator[62800]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:05:24 compute-0 systemd-sysv-generator[62804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:05:24 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 03 21:05:24 compute-0 sshd-session[62586]: Connection reset by invalid user admin 45.135.232.92 port 33328 [preauth]
Dec 03 21:05:24 compute-0 iptables.init[62810]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 03 21:05:24 compute-0 iptables.init[62810]: iptables: Flushing firewall rules: [  OK  ]
Dec 03 21:05:24 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Dec 03 21:05:24 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 03 21:05:25 compute-0 sudo[62768]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:25 compute-0 sudo[63006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kacatuporhvxvpzysxbldmxjofldvdkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795925.1868815-209-277337932881654/AnsiballZ_systemd.py'
Dec 03 21:05:25 compute-0 sudo[63006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:25 compute-0 python3.9[63008]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:05:25 compute-0 sudo[63006]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:26 compute-0 sudo[63160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxbzijghvmburlbdnvzgyxenfwprtlic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795926.1026568-225-15330124104436/AnsiballZ_systemd.py'
Dec 03 21:05:26 compute-0 sudo[63160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:26 compute-0 python3.9[63162]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:05:26 compute-0 sshd-session[62837]: Connection reset by authenticating user root 45.135.232.92 port 38398 [preauth]
Dec 03 21:05:26 compute-0 systemd[1]: Reloading.
Dec 03 21:05:26 compute-0 systemd-rc-local-generator[63191]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:05:26 compute-0 systemd-sysv-generator[63194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:05:27 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 03 21:05:27 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 03 21:05:27 compute-0 sudo[63160]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:27 compute-0 sudo[63354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhdqltvbumfgeenzukpndjsffltnyjyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795927.3201597-233-90587295923582/AnsiballZ_command.py'
Dec 03 21:05:27 compute-0 sudo[63354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:27 compute-0 python3.9[63356]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:05:28 compute-0 sudo[63354]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:28 compute-0 sshd-session[63200]: Invalid user 1234 from 45.135.232.92 port 38400
Dec 03 21:05:28 compute-0 sudo[63507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxxhzndksqrlpfvjganyysbnrdjpigj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795928.396085-247-33490062234291/AnsiballZ_stat.py'
Dec 03 21:05:28 compute-0 sudo[63507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:29 compute-0 python3.9[63509]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:29 compute-0 sudo[63507]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:29 compute-0 sshd-session[63200]: Connection reset by invalid user 1234 45.135.232.92 port 38400 [preauth]
Dec 03 21:05:29 compute-0 sudo[63633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfgymqvicnbpfqhjxhgmauylmdgvsfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795928.396085-247-33490062234291/AnsiballZ_copy.py'
Dec 03 21:05:29 compute-0 sudo[63633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:29 compute-0 python3.9[63635]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795928.396085-247-33490062234291/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:29 compute-0 sudo[63633]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:30 compute-0 sudo[63787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwlaogpwtjhepnmrklwsfsynrovapvwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795929.83761-262-263179620340208/AnsiballZ_systemd.py'
Dec 03 21:05:30 compute-0 sudo[63787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:30 compute-0 python3.9[63789]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:05:30 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Dec 03 21:05:30 compute-0 sshd[1007]: Received SIGHUP; restarting.
Dec 03 21:05:30 compute-0 sshd[1007]: Server listening on 0.0.0.0 port 22.
Dec 03 21:05:30 compute-0 sshd[1007]: Server listening on :: port 22.
Dec 03 21:05:30 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Dec 03 21:05:30 compute-0 sudo[63787]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:31 compute-0 sudo[63943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arxpogwryxghekvbpvmysvmhxinsneek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795930.793852-270-279751766651613/AnsiballZ_file.py'
Dec 03 21:05:31 compute-0 sudo[63943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:31 compute-0 sshd-session[63582]: Connection reset by authenticating user root 45.135.232.92 port 38414 [preauth]
Dec 03 21:05:31 compute-0 python3.9[63945]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:31 compute-0 sudo[63943]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:31 compute-0 sudo[64095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuehbrncaxlbgovmugvcwdmabwpzcxuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795931.533693-278-204717928591321/AnsiballZ_stat.py'
Dec 03 21:05:31 compute-0 sudo[64095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:32 compute-0 python3.9[64097]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:32 compute-0 sudo[64095]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:32 compute-0 sudo[64218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phazncwxcgefwvkdqxdyzuzrnrzfvido ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795931.533693-278-204717928591321/AnsiballZ_copy.py'
Dec 03 21:05:32 compute-0 sudo[64218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:32 compute-0 python3.9[64220]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795931.533693-278-204717928591321/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:32 compute-0 sudo[64218]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:33 compute-0 sudo[64370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmcttebuibulpoycweuokpvzxvljwjgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795933.001594-296-144043134559731/AnsiballZ_timezone.py'
Dec 03 21:05:33 compute-0 sudo[64370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:33 compute-0 python3.9[64372]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 03 21:05:33 compute-0 systemd[1]: Starting Time & Date Service...
Dec 03 21:05:33 compute-0 systemd[1]: Started Time & Date Service.
Dec 03 21:05:33 compute-0 sudo[64370]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:34 compute-0 sudo[64526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdmkyulmzznpmnoezylrypzjzjvmcppj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795934.1107783-305-158786204628926/AnsiballZ_file.py'
Dec 03 21:05:34 compute-0 sudo[64526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:34 compute-0 python3.9[64528]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:34 compute-0 sudo[64526]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:35 compute-0 sudo[64678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdpigxakiceopaefzphieotusorvjbuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795934.8472922-313-188161018391394/AnsiballZ_stat.py'
Dec 03 21:05:35 compute-0 sudo[64678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:35 compute-0 python3.9[64680]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:35 compute-0 sudo[64678]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:35 compute-0 sudo[64801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkryaqtnahhipbkqfylvvvinlipkctze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795934.8472922-313-188161018391394/AnsiballZ_copy.py'
Dec 03 21:05:35 compute-0 sudo[64801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:35 compute-0 python3.9[64803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795934.8472922-313-188161018391394/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:35 compute-0 sudo[64801]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:36 compute-0 sudo[64953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akzfwmjoxycmtkqttvgvsrldbnqjoqce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795936.0547075-328-67765121931819/AnsiballZ_stat.py'
Dec 03 21:05:36 compute-0 sudo[64953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:36 compute-0 python3.9[64955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:36 compute-0 sudo[64953]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:37 compute-0 sudo[65076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unyvaxqjvtbykeoohfqdbgdomwcvuiab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795936.0547075-328-67765121931819/AnsiballZ_copy.py'
Dec 03 21:05:37 compute-0 sudo[65076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:37 compute-0 python3.9[65078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764795936.0547075-328-67765121931819/.source.yaml _original_basename=.esxxuqjv follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:37 compute-0 sudo[65076]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:37 compute-0 sudo[65228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gywfxaeimxavyqkpjghklkjlnbdugmkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795937.4544232-343-266905661443422/AnsiballZ_stat.py'
Dec 03 21:05:37 compute-0 sudo[65228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:38 compute-0 python3.9[65230]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:38 compute-0 sudo[65228]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:38 compute-0 sudo[65351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhvjhqnltpzmecbkgyvuaggauwuwlmrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795937.4544232-343-266905661443422/AnsiballZ_copy.py'
Dec 03 21:05:38 compute-0 sudo[65351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:38 compute-0 python3.9[65353]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795937.4544232-343-266905661443422/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:38 compute-0 sudo[65351]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:39 compute-0 sudo[65503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvoiehzjfpbdagdbbnddhmibeyqvekzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795938.7622619-358-200829134061024/AnsiballZ_command.py'
Dec 03 21:05:39 compute-0 sudo[65503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:39 compute-0 python3.9[65505]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:05:39 compute-0 sudo[65503]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:39 compute-0 sudo[65656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsbzncpypkalbuegozeltpwiqaprgxwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795939.499697-366-127058862416759/AnsiballZ_command.py'
Dec 03 21:05:39 compute-0 sudo[65656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:40 compute-0 python3.9[65658]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:05:40 compute-0 sudo[65656]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:40 compute-0 sudo[65809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxeqopzlqqhwqggpancewqtrvuqjiinh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764795940.3083472-374-99786316991365/AnsiballZ_edpm_nftables_from_files.py'
Dec 03 21:05:40 compute-0 sudo[65809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:41 compute-0 python3[65811]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 03 21:05:41 compute-0 sudo[65809]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:41 compute-0 sudo[65961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shtqjedlhfrkzezimjhacbxlibxxpunb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795941.24429-382-209878639625021/AnsiballZ_stat.py'
Dec 03 21:05:41 compute-0 sudo[65961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:41 compute-0 python3.9[65963]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:41 compute-0 sudo[65961]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:42 compute-0 sudo[66084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtbwvwnnscqudkfcarnvdbboxkjttfci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795941.24429-382-209878639625021/AnsiballZ_copy.py'
Dec 03 21:05:42 compute-0 sudo[66084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:42 compute-0 python3.9[66086]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795941.24429-382-209878639625021/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:42 compute-0 sudo[66084]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:42 compute-0 sudo[66236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyhxcertckbqhbqihaskmjbuzkamswjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795942.5357482-397-24251246083243/AnsiballZ_stat.py'
Dec 03 21:05:42 compute-0 sudo[66236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:43 compute-0 python3.9[66238]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:43 compute-0 sudo[66236]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:43 compute-0 sudo[66359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkqxjcvwvdnidgeobfadtqhwkcnmbqvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795942.5357482-397-24251246083243/AnsiballZ_copy.py'
Dec 03 21:05:43 compute-0 sudo[66359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:43 compute-0 python3.9[66361]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795942.5357482-397-24251246083243/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:43 compute-0 sudo[66359]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:44 compute-0 sudo[66511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgnzdizofpwqthfiludssercwhgmcckj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795943.7905405-412-280377068001652/AnsiballZ_stat.py'
Dec 03 21:05:44 compute-0 sudo[66511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:44 compute-0 python3.9[66513]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:44 compute-0 sudo[66511]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:44 compute-0 sudo[66634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmvancmlpvzjtfmnmtssjqxweebfnzoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795943.7905405-412-280377068001652/AnsiballZ_copy.py'
Dec 03 21:05:44 compute-0 sudo[66634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:45 compute-0 python3.9[66636]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795943.7905405-412-280377068001652/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:45 compute-0 sudo[66634]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:45 compute-0 sudo[66786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmbqrmuaztdcdxaracogxyatgpbwvnmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795945.3336186-427-96654835139848/AnsiballZ_stat.py'
Dec 03 21:05:45 compute-0 sudo[66786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:45 compute-0 python3.9[66788]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:45 compute-0 sudo[66786]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:46 compute-0 sudo[66909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkumtchfrgyrasugpbzparhaieqcehlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795945.3336186-427-96654835139848/AnsiballZ_copy.py'
Dec 03 21:05:46 compute-0 sudo[66909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:46 compute-0 python3.9[66911]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795945.3336186-427-96654835139848/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:46 compute-0 sudo[66909]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:47 compute-0 sudo[67061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-texpbbufrdohsekgewfigxgzvukllrrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795946.6427174-442-20870292554775/AnsiballZ_stat.py'
Dec 03 21:05:47 compute-0 sudo[67061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:47 compute-0 python3.9[67063]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:05:47 compute-0 sudo[67061]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:47 compute-0 sudo[67184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvnvctqjvehytzjeupgnhzkavyezdwig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795946.6427174-442-20870292554775/AnsiballZ_copy.py'
Dec 03 21:05:47 compute-0 sudo[67184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:47 compute-0 python3.9[67186]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764795946.6427174-442-20870292554775/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:47 compute-0 sudo[67184]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:48 compute-0 sudo[67336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwkknurnqagfjzkwxeiprtdywismtvmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795948.070501-457-227972341159068/AnsiballZ_file.py'
Dec 03 21:05:48 compute-0 sudo[67336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:48 compute-0 python3.9[67338]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:48 compute-0 sudo[67336]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:49 compute-0 sudo[67488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yddiaplmijhhxwgynzquvryazecbshea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795948.703195-465-54235535248315/AnsiballZ_command.py'
Dec 03 21:05:49 compute-0 sudo[67488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:49 compute-0 python3.9[67490]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:05:49 compute-0 sudo[67488]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:50 compute-0 sudo[67647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hujjgylkbykbpxdfecejxshqbjodsgoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795949.5035682-473-113393585725595/AnsiballZ_blockinfile.py'
Dec 03 21:05:50 compute-0 sudo[67647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:50 compute-0 python3.9[67649]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:50 compute-0 sudo[67647]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:50 compute-0 sudo[67800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jemxgnezdavkmwtbblhbbwnhgqdrrxxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795950.623073-482-190310049309817/AnsiballZ_file.py'
Dec 03 21:05:50 compute-0 sudo[67800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:51 compute-0 python3.9[67802]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:51 compute-0 sudo[67800]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:51 compute-0 sudo[67952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvprioxucczlqbusckykxreuyhdpawer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795951.375291-482-139756755861750/AnsiballZ_file.py'
Dec 03 21:05:51 compute-0 sudo[67952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:51 compute-0 python3.9[67954]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:05:51 compute-0 sudo[67952]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:52 compute-0 sudo[68104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwafzorjeafaamwjmlahdmypycumqhhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795952.1579008-497-186550420656083/AnsiballZ_mount.py'
Dec 03 21:05:52 compute-0 sudo[68104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:52 compute-0 python3.9[68106]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 03 21:05:52 compute-0 sudo[68104]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:53 compute-0 sudo[68257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otvlpxgunfeojwfqoiehhrqvrrdisntt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795953.1237214-497-131437228864796/AnsiballZ_mount.py'
Dec 03 21:05:53 compute-0 sudo[68257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:53 compute-0 python3.9[68259]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 03 21:05:53 compute-0 sudo[68257]: pam_unix(sudo:session): session closed for user root
Dec 03 21:05:54 compute-0 sshd-session[59045]: Connection closed by 192.168.122.30 port 42376
Dec 03 21:05:54 compute-0 sshd-session[59042]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:05:54 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Dec 03 21:05:54 compute-0 systemd[1]: session-13.scope: Consumed 39.295s CPU time.
Dec 03 21:05:54 compute-0 systemd-logind[787]: Session 13 logged out. Waiting for processes to exit.
Dec 03 21:05:54 compute-0 systemd-logind[787]: Removed session 13.
Dec 03 21:05:59 compute-0 sshd-session[68285]: Accepted publickey for zuul from 192.168.122.30 port 33106 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:05:59 compute-0 systemd-logind[787]: New session 14 of user zuul.
Dec 03 21:05:59 compute-0 systemd[1]: Started Session 14 of User zuul.
Dec 03 21:05:59 compute-0 sshd-session[68285]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:05:59 compute-0 sudo[68438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scwzrunqhnvyxigntevfjukcegjwzuon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795959.1806111-16-226529466564353/AnsiballZ_tempfile.py'
Dec 03 21:05:59 compute-0 sudo[68438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:05:59 compute-0 python3.9[68440]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 03 21:05:59 compute-0 sudo[68438]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:00 compute-0 sudo[68590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdccyjegqwqhrczdhaplqfoqperkdage ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795960.1841018-28-228199317067382/AnsiballZ_stat.py'
Dec 03 21:06:00 compute-0 sudo[68590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:00 compute-0 python3.9[68592]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:06:00 compute-0 sudo[68590]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:01 compute-0 sudo[68742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsczeejwfgjbcorcfzwedwomvrnpevvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795961.2292213-38-40239387661673/AnsiballZ_setup.py'
Dec 03 21:06:01 compute-0 sudo[68742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:02 compute-0 python3.9[68744]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:06:02 compute-0 sudo[68742]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:02 compute-0 sudo[68894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbiykqvjxxpavfwcnibbnijbceikfplr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795962.4718046-47-268471861695249/AnsiballZ_blockinfile.py'
Dec 03 21:06:02 compute-0 sudo[68894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:03 compute-0 python3.9[68896]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/QcnFnE07R2H02WXa+53W3W+nwsFsC4YoQpDZUgxEwlg4f2zQf8fQIG23b5h9N8ej11I+FwfST4eb14wdXsFBAm6rVbCzkwQOmaDc1DdRfSmSFzwYKgqnejjeunc7W9ASRY8ZFAX/dexoruuzsoDFSnT/YK2DiUDLCoWmwO4mZ946GvsVF6yCywprEQo/oFdVyYbYBvGnl2hb9O06ePH8wQRx2BT7GKvzyv0j8Dz3LjXOzrd+jB7UlvodWIaHPlQhq/S/ZDfA640mfL7TSk/VRKvnWyi4m3+Gbj0A92cO36Objq1V2W1DPen5Nzv5CbZRHNjBvVR9G0jGLdsP8sWtUhe2qfiLZlAx0Cn0ZIhzPbS2Ij3lgp1Otug1NK15JYpiz9z0JO+UgfdZ9ht6yAYnsMcQ4OaFvKqWmsOxrx76BJ8s3hQuBMrZL+YgtbDswJVFn9/ay22MQ+ntCLeQL6GPb6WQJGnnWYqSlUX3e8wBllkbHrFK1/iyfqWjrHwteK8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINtkoZCmFpb3z8TzbldoOvjALaFBxUWmFrtA4oHE040r
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAzVjP1T+0nWOYuc0KdOyqtmhcGoQseIckbkxVi0stL4dfIoBsNFyujIS49nno21BKZJb6EV/fwil4CuPgbMlGg=
                                             create=True mode=0644 path=/tmp/ansible.i42xset5 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:06:03 compute-0 sudo[68894]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:03 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 03 21:06:03 compute-0 sudo[69048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koootgmxcslnwyqjdajwkebvlgrbgqwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795963.4009469-55-68460286360222/AnsiballZ_command.py'
Dec 03 21:06:03 compute-0 sudo[69048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:04 compute-0 python3.9[69050]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.i42xset5' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:04 compute-0 sudo[69048]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:04 compute-0 sudo[69202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpvucpkoacbhhitfhuoajqxedhhdjtzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795964.3269403-63-85917166599299/AnsiballZ_file.py'
Dec 03 21:06:04 compute-0 sudo[69202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:05 compute-0 python3.9[69204]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.i42xset5 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:06:05 compute-0 sudo[69202]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:05 compute-0 sshd-session[68288]: Connection closed by 192.168.122.30 port 33106
Dec 03 21:06:05 compute-0 sshd-session[68285]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:06:05 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Dec 03 21:06:05 compute-0 systemd[1]: session-14.scope: Consumed 4.103s CPU time.
Dec 03 21:06:05 compute-0 systemd-logind[787]: Session 14 logged out. Waiting for processes to exit.
Dec 03 21:06:05 compute-0 systemd-logind[787]: Removed session 14.
Dec 03 21:06:11 compute-0 sshd-session[69229]: Accepted publickey for zuul from 192.168.122.30 port 47768 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:06:11 compute-0 systemd-logind[787]: New session 15 of user zuul.
Dec 03 21:06:11 compute-0 systemd[1]: Started Session 15 of User zuul.
Dec 03 21:06:11 compute-0 sshd-session[69229]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:06:12 compute-0 python3.9[69382]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:06:13 compute-0 sudo[69536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jblzahjjokwcgwjehumrkgthrvkeiigo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795972.9579465-32-179779343186432/AnsiballZ_systemd.py'
Dec 03 21:06:13 compute-0 sudo[69536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:14 compute-0 python3.9[69538]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 03 21:06:14 compute-0 sudo[69536]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:14 compute-0 sudo[69690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwwqibsgemdrhgddblilmdqpmcmlmmaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795974.4717343-40-184070814995373/AnsiballZ_systemd.py'
Dec 03 21:06:14 compute-0 sudo[69690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:15 compute-0 python3.9[69692]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:06:15 compute-0 sudo[69690]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:15 compute-0 sudo[69843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjnylurrxmwgjjkbaawyvwgnbumuhvaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795975.3748152-49-91514868943395/AnsiballZ_command.py'
Dec 03 21:06:15 compute-0 sudo[69843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:16 compute-0 python3.9[69845]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:16 compute-0 sudo[69843]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:16 compute-0 sudo[69996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tccroanbnfwvnoauoiyuqtptjcfnaqng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795976.341956-57-166052143368443/AnsiballZ_stat.py'
Dec 03 21:06:16 compute-0 sudo[69996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:17 compute-0 python3.9[69998]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:06:17 compute-0 sudo[69996]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:17 compute-0 sudo[70150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfhketuosveduwgrjiyaddozmzydocua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795977.3153994-65-174145152156309/AnsiballZ_command.py'
Dec 03 21:06:17 compute-0 sudo[70150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:17 compute-0 python3.9[70152]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:17 compute-0 sudo[70150]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:18 compute-0 sudo[70305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyvhpsjvupesrdcjbbgceibeooiudkhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795978.1727352-73-5845874987217/AnsiballZ_file.py'
Dec 03 21:06:18 compute-0 sudo[70305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:18 compute-0 python3.9[70307]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:06:18 compute-0 sudo[70305]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:19 compute-0 sshd-session[69232]: Connection closed by 192.168.122.30 port 47768
Dec 03 21:06:19 compute-0 sshd-session[69229]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:06:19 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Dec 03 21:06:19 compute-0 systemd-logind[787]: Session 15 logged out. Waiting for processes to exit.
Dec 03 21:06:19 compute-0 systemd[1]: session-15.scope: Consumed 5.360s CPU time.
Dec 03 21:06:19 compute-0 systemd-logind[787]: Removed session 15.
Dec 03 21:06:24 compute-0 sshd-session[70332]: Accepted publickey for zuul from 192.168.122.30 port 48364 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:06:24 compute-0 systemd-logind[787]: New session 16 of user zuul.
Dec 03 21:06:24 compute-0 systemd[1]: Started Session 16 of User zuul.
Dec 03 21:06:24 compute-0 sshd-session[70332]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:06:25 compute-0 python3.9[70485]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:06:26 compute-0 sudo[70639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aumehigiejcplnbzieashthpeloepvjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795986.340853-34-223869117223274/AnsiballZ_setup.py'
Dec 03 21:06:26 compute-0 sudo[70639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:27 compute-0 python3.9[70641]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:06:27 compute-0 sudo[70639]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:27 compute-0 sudo[70723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krpzqzayavwecjtcyxxhiunvppnohjyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764795986.340853-34-223869117223274/AnsiballZ_dnf.py'
Dec 03 21:06:27 compute-0 sudo[70723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:27 compute-0 python3.9[70725]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 03 21:06:29 compute-0 sudo[70723]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:30 compute-0 python3.9[70876]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:31 compute-0 python3.9[71027]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 03 21:06:32 compute-0 python3.9[71177]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:06:32 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:06:33 compute-0 python3.9[71328]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:06:33 compute-0 sshd-session[70335]: Connection closed by 192.168.122.30 port 48364
Dec 03 21:06:33 compute-0 sshd-session[70332]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:06:33 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Dec 03 21:06:33 compute-0 systemd[1]: session-16.scope: Consumed 6.375s CPU time.
Dec 03 21:06:33 compute-0 systemd-logind[787]: Session 16 logged out. Waiting for processes to exit.
Dec 03 21:06:33 compute-0 systemd-logind[787]: Removed session 16.
Dec 03 21:06:41 compute-0 sshd-session[71354]: Accepted publickey for zuul from 38.102.83.47 port 55182 ssh2: RSA SHA256:Cbt6DRjvlxgKyw9DjjqWJJ3+P4VAN6Cwz5dn2cu8Cgg
Dec 03 21:06:41 compute-0 systemd-logind[787]: New session 17 of user zuul.
Dec 03 21:06:41 compute-0 systemd[1]: Started Session 17 of User zuul.
Dec 03 21:06:41 compute-0 sshd-session[71354]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:06:41 compute-0 sudo[71430]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlnkwfeksvjjgcfirihmdjcsicbzgtam ; /usr/bin/python3'
Dec 03 21:06:41 compute-0 sudo[71430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:41 compute-0 useradd[71434]: new group: name=ceph-admin, GID=42478
Dec 03 21:06:41 compute-0 useradd[71434]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Dec 03 21:06:41 compute-0 sudo[71430]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:42 compute-0 sudo[71516]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbzlyxhdyjqqhnobpmtqkmnmwbwyqbkk ; /usr/bin/python3'
Dec 03 21:06:42 compute-0 sudo[71516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:42 compute-0 sudo[71516]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:42 compute-0 sudo[71589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgszbibfhjuvhrxnydkutgxjyshkxhmm ; /usr/bin/python3'
Dec 03 21:06:42 compute-0 sudo[71589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:42 compute-0 sudo[71589]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:43 compute-0 sudo[71639]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evvyacybigictisrmdkkrkdlzjxthphf ; /usr/bin/python3'
Dec 03 21:06:43 compute-0 sudo[71639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:43 compute-0 sudo[71639]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:43 compute-0 sudo[71665]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgcwwtzbdsxfhgwjgqwxmppseqhyibix ; /usr/bin/python3'
Dec 03 21:06:43 compute-0 sudo[71665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:43 compute-0 sudo[71665]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:43 compute-0 sudo[71691]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnizlcnkonymsbfhlblxgarmfyqoartg ; /usr/bin/python3'
Dec 03 21:06:43 compute-0 sudo[71691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:43 compute-0 sudo[71691]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:44 compute-0 sudo[71717]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sndpujctwbiolsmfhtkoseazazxcqzlg ; /usr/bin/python3'
Dec 03 21:06:44 compute-0 sudo[71717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:44 compute-0 sudo[71717]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:44 compute-0 sudo[71795]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdrxpfishovthfpsqhkbfngjnxktnsoj ; /usr/bin/python3'
Dec 03 21:06:44 compute-0 sudo[71795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:44 compute-0 sudo[71795]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:45 compute-0 sudo[71868]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhcymbneeqgphrvmmznuaxygjzrkmucd ; /usr/bin/python3'
Dec 03 21:06:45 compute-0 sudo[71868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:45 compute-0 sudo[71868]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:45 compute-0 sudo[71970]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfzgyvkwvwcpgdeazsvknjpyslwpzbxg ; /usr/bin/python3'
Dec 03 21:06:45 compute-0 sudo[71970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:45 compute-0 sudo[71970]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:46 compute-0 sudo[72043]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxvxcfytacsjnirjmsswargbpsncacfi ; /usr/bin/python3'
Dec 03 21:06:46 compute-0 sudo[72043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:46 compute-0 sudo[72043]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:46 compute-0 sudo[72093]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yavayjrravwfnizblsbhqwohzlztifkf ; /usr/bin/python3'
Dec 03 21:06:46 compute-0 sudo[72093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:47 compute-0 python3[72095]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:06:47 compute-0 sudo[72093]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:48 compute-0 sudo[72188]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfkqlbchmqeaubllamsoxepleceocukk ; /usr/bin/python3'
Dec 03 21:06:48 compute-0 sudo[72188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:48 compute-0 python3[72190]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 03 21:06:49 compute-0 sudo[72188]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:50 compute-0 sudo[72215]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erzooyuzaaxwcbzjjnxtmvkcoarcajtg ; /usr/bin/python3'
Dec 03 21:06:50 compute-0 sudo[72215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:50 compute-0 python3[72217]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:06:50 compute-0 sudo[72215]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:50 compute-0 sudo[72241]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pervppvtrliitizqpkkvvbfgpvvosyfc ; /usr/bin/python3'
Dec 03 21:06:50 compute-0 sudo[72241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:50 compute-0 python3[72243]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:50 compute-0 kernel: loop: module loaded
Dec 03 21:06:50 compute-0 kernel: loop3: detected capacity change from 0 to 41943040
Dec 03 21:06:50 compute-0 sudo[72241]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:50 compute-0 sudo[72276]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqxalsbstxzcwaepanmiidajvugzgbxq ; /usr/bin/python3'
Dec 03 21:06:50 compute-0 sudo[72276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:50 compute-0 python3[72278]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:51 compute-0 lvm[72281]: PV /dev/loop3 not used.
Dec 03 21:06:51 compute-0 lvm[72283]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:06:51 compute-0 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 03 21:06:51 compute-0 lvm[72293]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:06:51 compute-0 lvm[72293]: VG ceph_vg0 finished
Dec 03 21:06:51 compute-0 lvm[72291]:   1 logical volume(s) in volume group "ceph_vg0" now active
Dec 03 21:06:51 compute-0 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec 03 21:06:51 compute-0 sudo[72276]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:51 compute-0 sudo[72369]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbcegeabgbxhxydzsxhgrqxresxdugka ; /usr/bin/python3'
Dec 03 21:06:51 compute-0 sudo[72369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:51 compute-0 python3[72371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 21:06:51 compute-0 sudo[72369]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:52 compute-0 sudo[72442]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yahsvbtftodtqooufmynvwwylmnstkts ; /usr/bin/python3'
Dec 03 21:06:52 compute-0 sudo[72442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:52 compute-0 python3[72444]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796011.4022021-36343-12362086634002/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:06:52 compute-0 sudo[72442]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:52 compute-0 sudo[72492]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nazwupitedtgrvagudroodwbzcpofygx ; /usr/bin/python3'
Dec 03 21:06:52 compute-0 sudo[72492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:52 compute-0 python3[72494]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:06:52 compute-0 systemd[1]: Reloading.
Dec 03 21:06:53 compute-0 systemd-rc-local-generator[72519]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:06:53 compute-0 systemd-sysv-generator[72525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:06:53 compute-0 systemd[1]: Starting Ceph OSD losetup...
Dec 03 21:06:53 compute-0 bash[72533]: /dev/loop3: [64513]:4327948 (/var/lib/ceph-osd-0.img)
Dec 03 21:06:53 compute-0 systemd[1]: Finished Ceph OSD losetup.
Dec 03 21:06:53 compute-0 sudo[72492]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:53 compute-0 lvm[72534]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:06:53 compute-0 lvm[72534]: VG ceph_vg0 finished
Dec 03 21:06:53 compute-0 sudo[72558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsjppaosjezafwjzazykqbavhnicllxu ; /usr/bin/python3'
Dec 03 21:06:53 compute-0 sudo[72558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:53 compute-0 python3[72560]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 03 21:06:54 compute-0 chronyd[58561]: Selected source 162.159.200.123 (pool.ntp.org)
Dec 03 21:06:54 compute-0 sudo[72558]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:55 compute-0 sudo[72585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxokoqnrcticjvrdbbgknjkikouzplkv ; /usr/bin/python3'
Dec 03 21:06:55 compute-0 sudo[72585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:55 compute-0 python3[72587]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:06:55 compute-0 sudo[72585]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:55 compute-0 sudo[72611]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nybxsgxacxaothjoseqymsylqmtchylf ; /usr/bin/python3'
Dec 03 21:06:55 compute-0 sudo[72611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:55 compute-0 python3[72613]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=20G
                                          losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:55 compute-0 kernel: loop4: detected capacity change from 0 to 41943040
Dec 03 21:06:55 compute-0 sudo[72611]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:55 compute-0 sudo[72643]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wajqfjnoafqgpzpnsbappjopisqhjihv ; /usr/bin/python3'
Dec 03 21:06:55 compute-0 sudo[72643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:55 compute-0 python3[72645]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                          vgcreate ceph_vg1 /dev/loop4
                                          lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:06:55 compute-0 lvm[72648]: PV /dev/loop4 not used.
Dec 03 21:06:56 compute-0 lvm[72657]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:06:56 compute-0 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Dec 03 21:06:56 compute-0 sudo[72643]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:56 compute-0 lvm[72659]:   1 logical volume(s) in volume group "ceph_vg1" now active
Dec 03 21:06:56 compute-0 systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Dec 03 21:06:56 compute-0 sudo[72735]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uykzjizfdnxcdkgrjadjvrgtiiolswld ; /usr/bin/python3'
Dec 03 21:06:56 compute-0 sudo[72735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:56 compute-0 python3[72737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 21:06:56 compute-0 sudo[72735]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:56 compute-0 sudo[72808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuxgmkkazglxsvevfxsywrreyfwkldwu ; /usr/bin/python3'
Dec 03 21:06:56 compute-0 sudo[72808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:56 compute-0 python3[72810]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796016.2793815-36370-175935688238143/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:06:57 compute-0 sudo[72808]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:57 compute-0 sudo[72858]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdfxogvhswiuhgtezzvywghrnyukkcse ; /usr/bin/python3'
Dec 03 21:06:57 compute-0 sudo[72858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:57 compute-0 python3[72860]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:06:57 compute-0 systemd[1]: Reloading.
Dec 03 21:06:57 compute-0 systemd-rc-local-generator[72887]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:06:57 compute-0 systemd-sysv-generator[72893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:06:57 compute-0 systemd[1]: Starting Ceph OSD losetup...
Dec 03 21:06:57 compute-0 bash[72900]: /dev/loop4: [64513]:4327955 (/var/lib/ceph-osd-1.img)
Dec 03 21:06:57 compute-0 systemd[1]: Finished Ceph OSD losetup.
Dec 03 21:06:57 compute-0 sudo[72858]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:57 compute-0 lvm[72901]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:06:57 compute-0 lvm[72901]: VG ceph_vg1 finished
Dec 03 21:06:58 compute-0 sudo[72925]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkfcekghfbzjahppijllrefzzhigebvr ; /usr/bin/python3'
Dec 03 21:06:58 compute-0 sudo[72925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:58 compute-0 python3[72927]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 03 21:06:59 compute-0 sudo[72925]: pam_unix(sudo:session): session closed for user root
Dec 03 21:06:59 compute-0 sudo[72952]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siiibtnhfcrvehmowwtidfwfyhyaglxp ; /usr/bin/python3'
Dec 03 21:06:59 compute-0 sudo[72952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:06:59 compute-0 python3[72954]: ansible-ansible.builtin.stat Invoked with path=/dev/loop5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:06:59 compute-0 sudo[72952]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:00 compute-0 sudo[72978]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhfnelogdmalrzjuuhufwuklkvveiwgl ; /usr/bin/python3'
Dec 03 21:07:00 compute-0 sudo[72978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:00 compute-0 python3[72980]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-2.img bs=1 count=0 seek=20G
                                          losetup /dev/loop5 /var/lib/ceph-osd-2.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:07:00 compute-0 kernel: loop5: detected capacity change from 0 to 41943040
Dec 03 21:07:00 compute-0 sudo[72978]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:00 compute-0 sudo[73010]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lycugddfoqsowtawxpwcdmebybjvpryv ; /usr/bin/python3'
Dec 03 21:07:00 compute-0 sudo[73010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:00 compute-0 python3[73012]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop5
                                          vgcreate ceph_vg2 /dev/loop5
                                          lvcreate -n ceph_lv2 -l +100%FREE ceph_vg2
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:07:00 compute-0 lvm[73015]: PV /dev/loop5 not used.
Dec 03 21:07:00 compute-0 lvm[73017]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:07:00 compute-0 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg2.
Dec 03 21:07:00 compute-0 lvm[73027]:   1 logical volume(s) in volume group "ceph_vg2" now active
Dec 03 21:07:00 compute-0 systemd[1]: lvm-activate-ceph_vg2.service: Deactivated successfully.
Dec 03 21:07:00 compute-0 sudo[73010]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:01 compute-0 sudo[73103]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvgeqclvzxxbvsfueywyjgylcuirusgx ; /usr/bin/python3'
Dec 03 21:07:01 compute-0 sudo[73103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:01 compute-0 python3[73105]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-2.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 21:07:01 compute-0 sudo[73103]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:01 compute-0 sudo[73176]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxxwmqvgspofxrefdttynuegzbtedkgo ; /usr/bin/python3'
Dec 03 21:07:01 compute-0 sudo[73176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:01 compute-0 python3[73178]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796021.059373-36401-200170879901882/source dest=/etc/systemd/system/ceph-osd-losetup-2.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=4c5b1bc5693c499ffe2edaa97d63f5df7075d845 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:07:01 compute-0 sudo[73176]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:02 compute-0 sudo[73226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtkwcwpervzunfpxpmbwevmiyspmufvs ; /usr/bin/python3'
Dec 03 21:07:02 compute-0 sudo[73226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:02 compute-0 python3[73228]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-2.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:07:02 compute-0 systemd[1]: Reloading.
Dec 03 21:07:02 compute-0 systemd-rc-local-generator[73259]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:07:02 compute-0 systemd-sysv-generator[73263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:07:02 compute-0 systemd[1]: Starting Ceph OSD losetup...
Dec 03 21:07:02 compute-0 bash[73268]: /dev/loop5: [64513]:4327966 (/var/lib/ceph-osd-2.img)
Dec 03 21:07:02 compute-0 systemd[1]: Finished Ceph OSD losetup.
Dec 03 21:07:02 compute-0 sudo[73226]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:02 compute-0 lvm[73269]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:07:02 compute-0 lvm[73269]: VG ceph_vg2 finished
Dec 03 21:07:04 compute-0 python3[73293]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:07:06 compute-0 sudo[73384]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggbvqgierojeslglpwtuzrqhzlflebcu ; /usr/bin/python3'
Dec 03 21:07:06 compute-0 sudo[73384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:06 compute-0 python3[73386]: ansible-ansible.legacy.dnf Invoked with name=['centos-release-ceph-tentacle'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 03 21:07:09 compute-0 sudo[73384]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:09 compute-0 sudo[73441]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rezchfeizjesxfozpdwpmmnhgvhawzfy ; /usr/bin/python3'
Dec 03 21:07:09 compute-0 sudo[73441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:09 compute-0 python3[73443]: ansible-ansible.legacy.dnf Invoked with name=['cephadm'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 03 21:07:13 compute-0 groupadd[73453]: group added to /etc/group: name=cephadm, GID=992
Dec 03 21:07:13 compute-0 groupadd[73453]: group added to /etc/gshadow: name=cephadm
Dec 03 21:07:13 compute-0 groupadd[73453]: new group: name=cephadm, GID=992
Dec 03 21:07:13 compute-0 useradd[73460]: new user: name=cephadm, UID=992, GID=992, home=/var/lib/cephadm, shell=/bin/bash, from=none
Dec 03 21:07:14 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:07:14 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:07:14 compute-0 sudo[73441]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:14 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:07:14 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:07:14 compute-0 systemd[1]: run-r8c46d8a96fdb4872aec1feda2bd47b9c.service: Deactivated successfully.
Dec 03 21:07:14 compute-0 sudo[73560]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahnafgogpftvlhbwrcjowzdgaeopbxmq ; /usr/bin/python3'
Dec 03 21:07:14 compute-0 sudo[73560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:14 compute-0 python3[73562]: ansible-ansible.builtin.stat Invoked with path=/usr/sbin/cephadm follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:07:14 compute-0 sudo[73560]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:15 compute-0 sudo[73588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgiewlafrdrwpuptzcafxpvigsiloxdw ; /usr/bin/python3'
Dec 03 21:07:15 compute-0 sudo[73588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:15 compute-0 python3[73590]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/cephadm ls --no-detail _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:07:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:15 compute-0 sudo[73588]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:16 compute-0 sudo[73625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caqbmdnkxvpciarkeykwawrtzwtbmwty ; /usr/bin/python3'
Dec 03 21:07:16 compute-0 sudo[73625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:16 compute-0 python3[73627]: ansible-ansible.builtin.file Invoked with path=/etc/ceph state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:07:16 compute-0 sudo[73625]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:16 compute-0 sudo[73651]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xejxzqywsoptpwtllgxgpvpmtbkompws ; /usr/bin/python3'
Dec 03 21:07:16 compute-0 sudo[73651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:16 compute-0 python3[73653]: ansible-ansible.builtin.file Invoked with path=/home/ceph-admin/specs owner=ceph-admin group=ceph-admin mode=0755 state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:07:16 compute-0 sudo[73651]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:17 compute-0 sudo[73729]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpbuohyqjpkockrduivcdvzdkwtwodyy ; /usr/bin/python3'
Dec 03 21:07:17 compute-0 sudo[73729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:17 compute-0 python3[73731]: ansible-ansible.legacy.stat Invoked with path=/home/ceph-admin/specs/ceph_spec.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 21:07:17 compute-0 sudo[73729]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:17 compute-0 sudo[73802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvwpcvdigtvulvaebpyoovecijxasjtb ; /usr/bin/python3'
Dec 03 21:07:17 compute-0 sudo[73802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:17 compute-0 python3[73804]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796037.2364986-36549-6794082631691/source dest=/home/ceph-admin/specs/ceph_spec.yaml owner=ceph-admin group=ceph-admin mode=0644 _original_basename=ceph_spec.yml follow=False checksum=bb83c53af4ffd926a3f1eafe26a8be437df6401f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:07:17 compute-0 sudo[73802]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:18 compute-0 sudo[73904]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugvpbnwembbsksavwtzrmvwntxndmgxy ; /usr/bin/python3'
Dec 03 21:07:18 compute-0 sudo[73904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:18 compute-0 python3[73906]: ansible-ansible.legacy.stat Invoked with path=/home/ceph-admin/assimilate_ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 21:07:18 compute-0 sudo[73904]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:18 compute-0 sudo[73977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vztobjeuzivdjhxhahtwzhckznkxuhmu ; /usr/bin/python3'
Dec 03 21:07:18 compute-0 sudo[73977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:19 compute-0 python3[73979]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796038.4235067-36567-102611996598372/source dest=/home/ceph-admin/assimilate_ceph.conf owner=ceph-admin group=ceph-admin mode=0644 _original_basename=initial_ceph.conf follow=False checksum=41828f7c2442fdf376911255e33c12863fc3b1b3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:07:19 compute-0 sudo[73977]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:19 compute-0 sudo[74027]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzrjerqfevyvwnjhwysarfdqsvgfxaqi ; /usr/bin/python3'
Dec 03 21:07:19 compute-0 sudo[74027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:19 compute-0 python3[74029]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/.ssh/id_rsa follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:07:19 compute-0 sudo[74027]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:19 compute-0 sudo[74055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvwvumhmwizugoplwstmdfinmjqzymjg ; /usr/bin/python3'
Dec 03 21:07:19 compute-0 sudo[74055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:19 compute-0 python3[74057]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/.ssh/id_rsa.pub follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:07:19 compute-0 sudo[74055]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:20 compute-0 sudo[74083]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozdhzdredqteolqlgfekvetjsuywdcqr ; /usr/bin/python3'
Dec 03 21:07:20 compute-0 sudo[74083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:20 compute-0 python3[74085]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/assimilate_ceph.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:07:20 compute-0 sudo[74083]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:20 compute-0 sudo[74111]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nylzxddhhbttialnzfmjcvewopirdfff ; /usr/bin/python3'
Dec 03 21:07:20 compute-0 sudo[74111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:07:20 compute-0 python3[74113]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/cephadm bootstrap --skip-firewalld --ssh-private-key /home/ceph-admin/.ssh/id_rsa --ssh-public-key /home/ceph-admin/.ssh/id_rsa.pub --ssh-user ceph-admin --allow-fqdn-hostname --output-keyring /etc/ceph/ceph.client.admin.keyring --output-config /etc/ceph/ceph.conf --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config /home/ceph-admin/assimilate_ceph.conf \--single-host-defaults \--skip-monitoring-stack --skip-dashboard --mon-ip 192.168.122.100
                                           _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:07:20 compute-0 sshd-session[74117]: Accepted publickey for ceph-admin from 192.168.122.100 port 55742 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:07:20 compute-0 systemd[1]: Created slice User Slice of UID 42477.
Dec 03 21:07:20 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 03 21:07:20 compute-0 systemd-logind[787]: New session 18 of user ceph-admin.
Dec 03 21:07:21 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 03 21:07:21 compute-0 systemd[1]: Starting User Manager for UID 42477...
Dec 03 21:07:21 compute-0 systemd[74121]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:07:21 compute-0 systemd[74121]: Queued start job for default target Main User Target.
Dec 03 21:07:21 compute-0 systemd[74121]: Created slice User Application Slice.
Dec 03 21:07:21 compute-0 systemd[74121]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 03 21:07:21 compute-0 systemd[74121]: Started Daily Cleanup of User's Temporary Directories.
Dec 03 21:07:21 compute-0 systemd[74121]: Reached target Paths.
Dec 03 21:07:21 compute-0 systemd[74121]: Reached target Timers.
Dec 03 21:07:21 compute-0 systemd[74121]: Starting D-Bus User Message Bus Socket...
Dec 03 21:07:21 compute-0 systemd[74121]: Starting Create User's Volatile Files and Directories...
Dec 03 21:07:21 compute-0 systemd[74121]: Finished Create User's Volatile Files and Directories.
Dec 03 21:07:21 compute-0 systemd[74121]: Listening on D-Bus User Message Bus Socket.
Dec 03 21:07:21 compute-0 systemd[74121]: Reached target Sockets.
Dec 03 21:07:21 compute-0 systemd[74121]: Reached target Basic System.
Dec 03 21:07:21 compute-0 systemd[74121]: Reached target Main User Target.
Dec 03 21:07:21 compute-0 systemd[74121]: Startup finished in 148ms.
Dec 03 21:07:21 compute-0 systemd[1]: Started User Manager for UID 42477.
Dec 03 21:07:21 compute-0 systemd[1]: Started Session 18 of User ceph-admin.
Dec 03 21:07:21 compute-0 sshd-session[74117]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:07:21 compute-0 sudo[74138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/echo
Dec 03 21:07:21 compute-0 sudo[74138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:07:21 compute-0 sudo[74138]: pam_unix(sudo:session): session closed for user root
Dec 03 21:07:21 compute-0 sshd-session[74137]: Received disconnect from 192.168.122.100 port 55742:11: disconnected by user
Dec 03 21:07:21 compute-0 sshd-session[74137]: Disconnected from user ceph-admin 192.168.122.100 port 55742
Dec 03 21:07:21 compute-0 sshd-session[74117]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 03 21:07:21 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Dec 03 21:07:21 compute-0 systemd-logind[787]: Session 18 logged out. Waiting for processes to exit.
Dec 03 21:07:21 compute-0 systemd-logind[787]: Removed session 18.
Dec 03 21:07:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat525026094-merged.mount: Deactivated successfully.
Dec 03 21:07:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat525026094-lower\x2dmapped.mount: Deactivated successfully.
Dec 03 21:07:31 compute-0 systemd[1]: Stopping User Manager for UID 42477...
Dec 03 21:07:31 compute-0 systemd[74121]: Activating special unit Exit the Session...
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped target Main User Target.
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped target Basic System.
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped target Paths.
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped target Sockets.
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped target Timers.
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 03 21:07:31 compute-0 systemd[74121]: Closed D-Bus User Message Bus Socket.
Dec 03 21:07:31 compute-0 systemd[74121]: Stopped Create User's Volatile Files and Directories.
Dec 03 21:07:31 compute-0 systemd[74121]: Removed slice User Application Slice.
Dec 03 21:07:31 compute-0 systemd[74121]: Reached target Shutdown.
Dec 03 21:07:31 compute-0 systemd[74121]: Finished Exit the Session.
Dec 03 21:07:31 compute-0 systemd[74121]: Reached target Exit the Session.
Dec 03 21:07:31 compute-0 systemd[1]: user@42477.service: Deactivated successfully.
Dec 03 21:07:31 compute-0 systemd[1]: Stopped User Manager for UID 42477.
Dec 03 21:07:31 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec 03 21:07:31 compute-0 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec 03 21:07:31 compute-0 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec 03 21:07:31 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec 03 21:07:31 compute-0 systemd[1]: Removed slice User Slice of UID 42477.
Dec 03 21:07:52 compute-0 podman[74215]: 2025-12-03 21:07:52.662207341 +0000 UTC m=+31.029170905 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:52 compute-0 podman[74315]: 2025-12-03 21:07:52.725241966 +0000 UTC m=+0.041476640 container create cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105 (image=quay.io/ceph/ceph:v20, name=xenodochial_lewin, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:07:52 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 03 21:07:52 compute-0 systemd[1]: Started libpod-conmon-cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105.scope.
Dec 03 21:07:52 compute-0 podman[74315]: 2025-12-03 21:07:52.704631464 +0000 UTC m=+0.020866158 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:52 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:52 compute-0 podman[74315]: 2025-12-03 21:07:52.842929951 +0000 UTC m=+0.159164655 container init cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105 (image=quay.io/ceph/ceph:v20, name=xenodochial_lewin, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:07:52 compute-0 podman[74315]: 2025-12-03 21:07:52.855127846 +0000 UTC m=+0.171362520 container start cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105 (image=quay.io/ceph/ceph:v20, name=xenodochial_lewin, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:07:52 compute-0 podman[74315]: 2025-12-03 21:07:52.858984189 +0000 UTC m=+0.175218893 container attach cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105 (image=quay.io/ceph/ceph:v20, name=xenodochial_lewin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:07:52 compute-0 xenodochial_lewin[74331]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable)
Dec 03 21:07:52 compute-0 systemd[1]: libpod-cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105.scope: Deactivated successfully.
Dec 03 21:07:52 compute-0 podman[74315]: 2025-12-03 21:07:52.971356702 +0000 UTC m=+0.287591386 container died cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105 (image=quay.io/ceph/ceph:v20, name=xenodochial_lewin, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:07:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-6f4effdcdd993c630bc445662db3d2ed5919b08724f1a8ca24ff12ef9c1b4d48-merged.mount: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74315]: 2025-12-03 21:07:53.017489885 +0000 UTC m=+0.333724599 container remove cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105 (image=quay.io/ceph/ceph:v20, name=xenodochial_lewin, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:07:53 compute-0 systemd[1]: libpod-conmon-cfa3bef3b1357c74a02f70d5f0ed213e57889efee2675bc5df8b105852e29105.scope: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74350]: 2025-12-03 21:07:53.098067009 +0000 UTC m=+0.056926263 container create fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc (image=quay.io/ceph/ceph:v20, name=magical_lamport, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:07:53 compute-0 systemd[1]: Started libpod-conmon-fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc.scope.
Dec 03 21:07:53 compute-0 podman[74350]: 2025-12-03 21:07:53.067832541 +0000 UTC m=+0.026692245 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:53 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:53 compute-0 podman[74350]: 2025-12-03 21:07:53.203408634 +0000 UTC m=+0.162267938 container init fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc (image=quay.io/ceph/ceph:v20, name=magical_lamport, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:07:53 compute-0 podman[74350]: 2025-12-03 21:07:53.210546005 +0000 UTC m=+0.169405249 container start fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc (image=quay.io/ceph/ceph:v20, name=magical_lamport, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:07:53 compute-0 magical_lamport[74367]: 167 167
Dec 03 21:07:53 compute-0 podman[74350]: 2025-12-03 21:07:53.2148886 +0000 UTC m=+0.173747844 container attach fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc (image=quay.io/ceph/ceph:v20, name=magical_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:07:53 compute-0 systemd[1]: libpod-fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc.scope: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74350]: 2025-12-03 21:07:53.216672008 +0000 UTC m=+0.175531302 container died fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc (image=quay.io/ceph/ceph:v20, name=magical_lamport, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:07:53 compute-0 podman[74350]: 2025-12-03 21:07:53.26685819 +0000 UTC m=+0.225717444 container remove fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc (image=quay.io/ceph/ceph:v20, name=magical_lamport, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3)
Dec 03 21:07:53 compute-0 systemd[1]: libpod-conmon-fa13357bc690ccbccb112fb2a7760709bc0a0fc015ed78b296798422824b06fc.scope: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74386]: 2025-12-03 21:07:53.370779887 +0000 UTC m=+0.068474031 container create 01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39 (image=quay.io/ceph/ceph:v20, name=recursing_payne, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:07:53 compute-0 systemd[1]: Started libpod-conmon-01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39.scope.
Dec 03 21:07:53 compute-0 podman[74386]: 2025-12-03 21:07:53.341087123 +0000 UTC m=+0.038781317 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:53 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:53 compute-0 podman[74386]: 2025-12-03 21:07:53.456885748 +0000 UTC m=+0.154579952 container init 01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39 (image=quay.io/ceph/ceph:v20, name=recursing_payne, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:07:53 compute-0 podman[74386]: 2025-12-03 21:07:53.491743529 +0000 UTC m=+0.189437633 container start 01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39 (image=quay.io/ceph/ceph:v20, name=recursing_payne, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:07:53 compute-0 podman[74386]: 2025-12-03 21:07:53.49548206 +0000 UTC m=+0.193176174 container attach 01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39 (image=quay.io/ceph/ceph:v20, name=recursing_payne, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:07:53 compute-0 recursing_payne[74403]: AQCppjBpMJnQHhAAxGJU1UHRttnvhIEom6jHMg==
Dec 03 21:07:53 compute-0 systemd[1]: libpod-01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39.scope: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74386]: 2025-12-03 21:07:53.52093023 +0000 UTC m=+0.218624334 container died 01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39 (image=quay.io/ceph/ceph:v20, name=recursing_payne, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:07:53 compute-0 podman[74386]: 2025-12-03 21:07:53.561994677 +0000 UTC m=+0.259688781 container remove 01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39 (image=quay.io/ceph/ceph:v20, name=recursing_payne, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:07:53 compute-0 systemd[1]: libpod-conmon-01dc863d99e55dcf4c1a3bfd0cf11c9411dec70bd9d141c348567cf860218b39.scope: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74422]: 2025-12-03 21:07:53.653984755 +0000 UTC m=+0.060437465 container create fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644 (image=quay.io/ceph/ceph:v20, name=festive_tu, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:07:53 compute-0 systemd[1]: Started libpod-conmon-fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644.scope.
Dec 03 21:07:53 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:53 compute-0 podman[74422]: 2025-12-03 21:07:53.63319864 +0000 UTC m=+0.039651430 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:53 compute-0 podman[74422]: 2025-12-03 21:07:53.737172118 +0000 UTC m=+0.143624928 container init fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644 (image=quay.io/ceph/ceph:v20, name=festive_tu, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:07:53 compute-0 podman[74422]: 2025-12-03 21:07:53.743816466 +0000 UTC m=+0.150269216 container start fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644 (image=quay.io/ceph/ceph:v20, name=festive_tu, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:07:53 compute-0 podman[74422]: 2025-12-03 21:07:53.747892745 +0000 UTC m=+0.154345525 container attach fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644 (image=quay.io/ceph/ceph:v20, name=festive_tu, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default)
Dec 03 21:07:53 compute-0 festive_tu[74438]: AQCppjBpMrH/LRAA51zxlKZtaR3lOo3HPtc7fg==
Dec 03 21:07:53 compute-0 systemd[1]: libpod-fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644.scope: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74445]: 2025-12-03 21:07:53.838173448 +0000 UTC m=+0.043748211 container died fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644 (image=quay.io/ceph/ceph:v20, name=festive_tu, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:07:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-01e033bd4d9a27f01aa93a6be8b52359cae7bf9c9d7e9cb5b6e2ba45850209be-merged.mount: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74445]: 2025-12-03 21:07:53.875880085 +0000 UTC m=+0.081454858 container remove fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644 (image=quay.io/ceph/ceph:v20, name=festive_tu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 03 21:07:53 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:53 compute-0 systemd[1]: libpod-conmon-fb488507d3399c212fe8aa467a4a6b056db47d76762a6c0b3be6b38a00d62644.scope: Deactivated successfully.
Dec 03 21:07:53 compute-0 podman[74461]: 2025-12-03 21:07:53.951011003 +0000 UTC m=+0.045113497 container create b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9 (image=quay.io/ceph/ceph:v20, name=cool_lichterman, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:07:53 compute-0 systemd[1]: Started libpod-conmon-b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9.scope.
Dec 03 21:07:54 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:54 compute-0 podman[74461]: 2025-12-03 21:07:53.931645375 +0000 UTC m=+0.025747879 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:54 compute-0 podman[74461]: 2025-12-03 21:07:54.388078544 +0000 UTC m=+0.482181078 container init b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9 (image=quay.io/ceph/ceph:v20, name=cool_lichterman, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 03 21:07:54 compute-0 podman[74461]: 2025-12-03 21:07:54.394214728 +0000 UTC m=+0.488317222 container start b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9 (image=quay.io/ceph/ceph:v20, name=cool_lichterman, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 03 21:07:54 compute-0 cool_lichterman[74477]: AQCqpjBpRPTUGRAAGY4UVSICZn6CZFC4ph5YxQ==
Dec 03 21:07:54 compute-0 systemd[1]: libpod-b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9.scope: Deactivated successfully.
Dec 03 21:07:56 compute-0 podman[74461]: 2025-12-03 21:07:56.733067612 +0000 UTC m=+2.827170256 container attach b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9 (image=quay.io/ceph/ceph:v20, name=cool_lichterman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:07:56 compute-0 podman[74461]: 2025-12-03 21:07:56.734381807 +0000 UTC m=+2.828484361 container died b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9 (image=quay.io/ceph/ceph:v20, name=cool_lichterman, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:07:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-0ae07e9bb40ee33a26ff332833560dcaf2b247df09be2bfe7d50c65ca261152b-merged.mount: Deactivated successfully.
Dec 03 21:07:56 compute-0 podman[74461]: 2025-12-03 21:07:56.807358167 +0000 UTC m=+2.901460691 container remove b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9 (image=quay.io/ceph/ceph:v20, name=cool_lichterman, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:07:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:56 compute-0 systemd[1]: libpod-conmon-b0038074b4239792dbd827a80e7b440869b6614d013c8cb899510d25ae307ce9.scope: Deactivated successfully.
Dec 03 21:07:56 compute-0 podman[74495]: 2025-12-03 21:07:56.883622995 +0000 UTC m=+0.049829213 container create 10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55 (image=quay.io/ceph/ceph:v20, name=sharp_bouman, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:07:56 compute-0 systemd[1]: Started libpod-conmon-10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55.scope.
Dec 03 21:07:56 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/684d75e7ff4caf0205f7b7bf0d4ea157a61d960ebaf18a309166960bf14b3710/merged/tmp/monmap supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:56 compute-0 podman[74495]: 2025-12-03 21:07:56.860136658 +0000 UTC m=+0.026342876 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:56 compute-0 podman[74495]: 2025-12-03 21:07:56.956807611 +0000 UTC m=+0.123013869 container init 10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55 (image=quay.io/ceph/ceph:v20, name=sharp_bouman, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 03 21:07:56 compute-0 podman[74495]: 2025-12-03 21:07:56.964372603 +0000 UTC m=+0.130578791 container start 10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55 (image=quay.io/ceph/ceph:v20, name=sharp_bouman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:07:56 compute-0 podman[74495]: 2025-12-03 21:07:56.968970916 +0000 UTC m=+0.135177184 container attach 10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55 (image=quay.io/ceph/ceph:v20, name=sharp_bouman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:07:57 compute-0 sharp_bouman[74511]: /usr/bin/monmaptool: monmap file /tmp/monmap
Dec 03 21:07:57 compute-0 sharp_bouman[74511]: setting min_mon_release = tentacle
Dec 03 21:07:57 compute-0 sharp_bouman[74511]: /usr/bin/monmaptool: set fsid to c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:07:57 compute-0 sharp_bouman[74511]: /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
Dec 03 21:07:57 compute-0 systemd[1]: libpod-10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55.scope: Deactivated successfully.
Dec 03 21:07:57 compute-0 podman[74495]: 2025-12-03 21:07:57.004736552 +0000 UTC m=+0.170942770 container died 10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55 (image=quay.io/ceph/ceph:v20, name=sharp_bouman, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:07:57 compute-0 podman[74495]: 2025-12-03 21:07:57.057649446 +0000 UTC m=+0.223855664 container remove 10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55 (image=quay.io/ceph/ceph:v20, name=sharp_bouman, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:07:57 compute-0 systemd[1]: libpod-conmon-10eeea3b0bff7a8e1f23b523b5335e035569e5debedc2aa5ec1d5a58c1f43e55.scope: Deactivated successfully.
Dec 03 21:07:57 compute-0 podman[74532]: 2025-12-03 21:07:57.168131369 +0000 UTC m=+0.075303904 container create a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3 (image=quay.io/ceph/ceph:v20, name=goofy_varahamihira, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 03 21:07:57 compute-0 systemd[1]: Started libpod-conmon-a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3.scope.
Dec 03 21:07:57 compute-0 podman[74532]: 2025-12-03 21:07:57.145619966 +0000 UTC m=+0.052792521 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:57 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e842c6671a986ceed0e9b82a6663daa8c7a15b2c0791079f82602e3541dd5134/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e842c6671a986ceed0e9b82a6663daa8c7a15b2c0791079f82602e3541dd5134/merged/tmp/monmap supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e842c6671a986ceed0e9b82a6663daa8c7a15b2c0791079f82602e3541dd5134/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e842c6671a986ceed0e9b82a6663daa8c7a15b2c0791079f82602e3541dd5134/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:57 compute-0 podman[74532]: 2025-12-03 21:07:57.265784878 +0000 UTC m=+0.172957463 container init a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3 (image=quay.io/ceph/ceph:v20, name=goofy_varahamihira, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:07:57 compute-0 podman[74532]: 2025-12-03 21:07:57.281264262 +0000 UTC m=+0.188436797 container start a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3 (image=quay.io/ceph/ceph:v20, name=goofy_varahamihira, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, OSD_FLAVOR=default)
Dec 03 21:07:57 compute-0 podman[74532]: 2025-12-03 21:07:57.28641581 +0000 UTC m=+0.193588415 container attach a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3 (image=quay.io/ceph/ceph:v20, name=goofy_varahamihira, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:07:57 compute-0 systemd[1]: libpod-a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3.scope: Deactivated successfully.
Dec 03 21:07:57 compute-0 podman[74532]: 2025-12-03 21:07:57.388075676 +0000 UTC m=+0.295248271 container died a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3 (image=quay.io/ceph/ceph:v20, name=goofy_varahamihira, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:07:57 compute-0 podman[74532]: 2025-12-03 21:07:57.435017641 +0000 UTC m=+0.342190176 container remove a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3 (image=quay.io/ceph/ceph:v20, name=goofy_varahamihira, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:07:57 compute-0 systemd[1]: libpod-conmon-a3dbdf7d75406e2f404ab9f653a6b4d797b17d232374fedc67b0ed37273ff7d3.scope: Deactivated successfully.
Dec 03 21:07:57 compute-0 systemd[1]: Reloading.
Dec 03 21:07:57 compute-0 systemd-rc-local-generator[74617]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:07:57 compute-0 systemd-sysv-generator[74620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:07:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:57 compute-0 systemd[1]: Reloading.
Dec 03 21:07:57 compute-0 systemd-rc-local-generator[74651]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:07:57 compute-0 systemd-sysv-generator[74655]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:07:58 compute-0 systemd[1]: Reached target All Ceph clusters and services.
Dec 03 21:07:58 compute-0 systemd[1]: Reloading.
Dec 03 21:07:58 compute-0 systemd-rc-local-generator[74696]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:07:58 compute-0 systemd-sysv-generator[74699]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:07:58 compute-0 systemd[1]: Reached target Ceph cluster c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:07:58 compute-0 systemd[1]: Reloading.
Dec 03 21:07:58 compute-0 systemd-sysv-generator[74729]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:07:58 compute-0 systemd-rc-local-generator[74726]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:07:58 compute-0 systemd[1]: Reloading.
Dec 03 21:07:58 compute-0 systemd-rc-local-generator[74768]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:07:58 compute-0 systemd-sysv-generator[74772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:07:58 compute-0 systemd[1]: Created slice Slice /system/ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:07:58 compute-0 systemd[1]: Reached target System Time Set.
Dec 03 21:07:58 compute-0 systemd[1]: Reached target System Time Synchronized.
Dec 03 21:07:58 compute-0 systemd[1]: Starting Ceph mon.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:07:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:07:59 compute-0 podman[74830]: 2025-12-03 21:07:59.156067815 +0000 UTC m=+0.060193450 container create 21a850393ffc3b228a0c2499d1dbfa6a53893dec4078b9c095d7b72c6199ac2e (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7826f93da93a73ed4e24bc845da7f990ade20e4d48691fc3ded33160f103e6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7826f93da93a73ed4e24bc845da7f990ade20e4d48691fc3ded33160f103e6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7826f93da93a73ed4e24bc845da7f990ade20e4d48691fc3ded33160f103e6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 podman[74830]: 2025-12-03 21:07:59.126192166 +0000 UTC m=+0.030317871 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7826f93da93a73ed4e24bc845da7f990ade20e4d48691fc3ded33160f103e6/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 podman[74830]: 2025-12-03 21:07:59.234982053 +0000 UTC m=+0.139107748 container init 21a850393ffc3b228a0c2499d1dbfa6a53893dec4078b9c095d7b72c6199ac2e (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:07:59 compute-0 podman[74830]: 2025-12-03 21:07:59.246979095 +0000 UTC m=+0.151104720 container start 21a850393ffc3b228a0c2499d1dbfa6a53893dec4078b9c095d7b72c6199ac2e (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:07:59 compute-0 bash[74830]: 21a850393ffc3b228a0c2499d1dbfa6a53893dec4078b9c095d7b72c6199ac2e
Dec 03 21:07:59 compute-0 systemd[1]: Started Ceph mon.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:07:59 compute-0 ceph-mon[74850]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-mon, pid 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: pidfile_write: ignore empty --pid-file
Dec 03 21:07:59 compute-0 ceph-mon[74850]: load: jerasure load: lrc 
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Git sha 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: DB SUMMARY
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: DB Session ID:  7J40V8I7ABGAZ2BL57XQ
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-0/store.db dir, Total Num: 0, files: 
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-0/store.db: 000004.log size: 807 ; 
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                                     Options.env: 0x55a3e0832440
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                                Options.info_log: 0x55a3e18613e0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                                 Options.wal_dir: 
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                    Options.write_buffer_manager: 0x55a3e17e0140
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                               Options.row_cache: None
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                              Options.wal_filter: None
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.wal_compression: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.max_background_jobs: 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.max_total_wal_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:       Options.compaction_readahead_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Compression algorithms supported:
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kZSTD supported: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000005
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:           Options.merge_operator: 
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:        Options.compaction_filter: None
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a3e17ec600)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a3e17d18d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:        Options.write_buffer_size: 33554432
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:  Options.max_write_buffer_number: 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:          Options.compression: NoCompression
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.num_levels: 7
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d83d641b-0db7-44b5-9540-349f4c36f664
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796079329404, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796079332085, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1944, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 819, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 696, "raw_average_value_size": 139, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "7J40V8I7ABGAZ2BL57XQ", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796079332369, "job": 1, "event": "recovery_finished"}
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55a3e17fee00
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: DB pointer 0x55a3e194a000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:07:59 compute-0 ceph-mon[74850]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.90 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.90 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a3e17d18d0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,0.95 KB,0.000181794%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 03 21:07:59 compute-0 ceph-mon[74850]: starting mon.compute-0 rank 0 at public addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] at bind addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-0 fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@-1(???) e0 preinit fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@-1(probing) e0  my rank is now 0 (was -1)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(probing) e0 win_standalone_election
Dec 03 21:07:59 compute-0 ceph-mon[74850]: paxos.0).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 03 21:07:59 compute-0 podman[74851]: 2025-12-03 21:07:59.348741884 +0000 UTC m=+0.059352447 container create 443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2 (image=quay.io/ceph/ceph:v20, name=affectionate_goldberg, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(electing) e0 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 create_pending setting backfillfull_ratio = 0.9
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 create_pending setting full_ratio = 0.95
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 create_pending setting nearfull_ratio = 0.85
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 do_prune osdmap full prune enabled
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e0 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).paxosservice(auth 0..0) refresh upgraded, format 3 -> 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(probing) e1 win_standalone_election
Dec 03 21:07:59 compute-0 ceph-mon[74850]: paxos.0).electionLogic(2) init, last seen epoch 2
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(electing) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : monmap epoch 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : last_changed 2025-12-03T21:07:57.000116+0000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : created 2025-12-03T21:07:57.000116+0000
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : min_mon_release 20 (tentacle)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : election_strategy: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mgrc update_daemon_metadata mon.compute-0 metadata {addrs=[v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0],arch=x86_64,ceph_release=tentacle,ceph_version=ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo),ceph_version_short=20.2.0,ceph_version_when_created=ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-0,container_image=quay.io/ceph/ceph:v20,cpu=AMD EPYC-Rome Processor,created_at=2025-12-03T21:07:57.329036Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-0,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025,kernel_version=5.14.0-645.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 create_pending setting backfillfull_ratio = 0.9
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 create_pending setting full_ratio = 0.95
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 create_pending setting nearfull_ratio = 0.85
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 do_prune osdmap full prune enabled
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e1 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout,17=tentacle ondisk layout}
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).mds e1 new map
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).mds e1 print_map
                                           e1
                                           btime 2025-12-03T21:07:59:373870+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).paxosservice(auth 0..0) refresh upgraded, format 3 -> 0
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : fsmap 
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e1 e1: 0 total, 0 up, 0 in
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e1 crush map has features 3314932999778484224, adjusting msgr requires
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mkfs c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader).paxosservice(auth 1..1) refresh upgraded, format 0 -> 3
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : osdmap e1: 0 total, 0 up, 0 in
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(cluster) log [DBG] : mgrmap e1: no daemons active
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 03 21:07:59 compute-0 systemd[1]: Started libpod-conmon-443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2.scope.
Dec 03 21:07:59 compute-0 podman[74851]: 2025-12-03 21:07:59.319677127 +0000 UTC m=+0.030287710 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:59 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a07e0ed30aed43b3ff86a38927958aa56606b7e5df55422323d3738356e3ed85/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a07e0ed30aed43b3ff86a38927958aa56606b7e5df55422323d3738356e3ed85/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a07e0ed30aed43b3ff86a38927958aa56606b7e5df55422323d3738356e3ed85/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 podman[74851]: 2025-12-03 21:07:59.474895035 +0000 UTC m=+0.185505608 container init 443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2 (image=quay.io/ceph/ceph:v20, name=affectionate_goldberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:07:59 compute-0 podman[74851]: 2025-12-03 21:07:59.486032923 +0000 UTC m=+0.196643456 container start 443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2 (image=quay.io/ceph/ceph:v20, name=affectionate_goldberg, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:07:59 compute-0 podman[74851]: 2025-12-03 21:07:59.489506395 +0000 UTC m=+0.200116928 container attach 443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2 (image=quay.io/ceph/ceph:v20, name=affectionate_goldberg, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0)
Dec 03 21:07:59 compute-0 ceph-mon[74850]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1609393615' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:   cluster:
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     id:     c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     health: HEALTH_OK
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:  
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:   services:
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     mon: 1 daemons, quorum compute-0 (age 0.319761s) [leader: compute-0]
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     mgr: no daemons active
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     osd: 0 osds: 0 up, 0 in
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:  
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:   data:
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     pools:   0 pools, 0 pgs
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     objects: 0 objects, 0 B
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     usage:   0 B used, 0 B / 0 B avail
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:     pgs:     
Dec 03 21:07:59 compute-0 affectionate_goldberg[74905]:  
Dec 03 21:07:59 compute-0 systemd[1]: libpod-443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2.scope: Deactivated successfully.
Dec 03 21:07:59 compute-0 podman[74851]: 2025-12-03 21:07:59.707510722 +0000 UTC m=+0.418121345 container died 443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2 (image=quay.io/ceph/ceph:v20, name=affectionate_goldberg, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:07:59 compute-0 podman[74851]: 2025-12-03 21:07:59.763801466 +0000 UTC m=+0.474412019 container remove 443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2 (image=quay.io/ceph/ceph:v20, name=affectionate_goldberg, CEPH_REF=tentacle, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:07:59 compute-0 systemd[1]: libpod-conmon-443a0578fc77013b272b08bbc8d47f78ace0a0e0c715b75560c7398f20f73ae2.scope: Deactivated successfully.
Dec 03 21:07:59 compute-0 podman[74945]: 2025-12-03 21:07:59.843710362 +0000 UTC m=+0.055920926 container create cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396 (image=quay.io/ceph/ceph:v20, name=jovial_cray, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 21:07:59 compute-0 systemd[1]: Started libpod-conmon-cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396.scope.
Dec 03 21:07:59 compute-0 podman[74945]: 2025-12-03 21:07:59.814012388 +0000 UTC m=+0.026222992 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:07:59 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a8bcaaac490cf82902f677024e14ca4e5bad549716722ccae9aaae4130eed26/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a8bcaaac490cf82902f677024e14ca4e5bad549716722ccae9aaae4130eed26/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a8bcaaac490cf82902f677024e14ca4e5bad549716722ccae9aaae4130eed26/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a8bcaaac490cf82902f677024e14ca4e5bad549716722ccae9aaae4130eed26/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:07:59 compute-0 podman[74945]: 2025-12-03 21:07:59.93381528 +0000 UTC m=+0.146025884 container init cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396 (image=quay.io/ceph/ceph:v20, name=jovial_cray, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:07:59 compute-0 podman[74945]: 2025-12-03 21:07:59.941720801 +0000 UTC m=+0.153931345 container start cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396 (image=quay.io/ceph/ceph:v20, name=jovial_cray, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:07:59 compute-0 podman[74945]: 2025-12-03 21:07:59.945945203 +0000 UTC m=+0.158155817 container attach cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396 (image=quay.io/ceph/ceph:v20, name=jovial_cray, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:00 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config assimilate-conf"} v 0)
Dec 03 21:08:00 compute-0 ceph-mon[74850]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1150063760' entity='client.admin' cmd={"prefix": "config assimilate-conf"} : dispatch
Dec 03 21:08:00 compute-0 ceph-mon[74850]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1150063760' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Dec 03 21:08:00 compute-0 jovial_cray[74962]: 
Dec 03 21:08:00 compute-0 jovial_cray[74962]: [global]
Dec 03 21:08:00 compute-0 jovial_cray[74962]:         fsid = c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:00 compute-0 jovial_cray[74962]:         mon_host = [v2:192.168.122.100:3300,v1:192.168.122.100:6789]
Dec 03 21:08:00 compute-0 jovial_cray[74962]:         osd_crush_chooseleaf_type = 0
Dec 03 21:08:00 compute-0 systemd[1]: libpod-cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396.scope: Deactivated successfully.
Dec 03 21:08:00 compute-0 podman[74945]: 2025-12-03 21:08:00.183881513 +0000 UTC m=+0.396092037 container died cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396 (image=quay.io/ceph/ceph:v20, name=jovial_cray, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:08:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-5a8bcaaac490cf82902f677024e14ca4e5bad549716722ccae9aaae4130eed26-merged.mount: Deactivated successfully.
Dec 03 21:08:00 compute-0 podman[74945]: 2025-12-03 21:08:00.231377609 +0000 UTC m=+0.443588133 container remove cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396 (image=quay.io/ceph/ceph:v20, name=jovial_cray, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:08:00 compute-0 systemd[1]: libpod-conmon-cff5dfb52fb6f6e4ae48eb94fe0f9ea460e58eb524c4126b7780c52e336fd396.scope: Deactivated successfully.
Dec 03 21:08:00 compute-0 podman[75000]: 2025-12-03 21:08:00.326671888 +0000 UTC m=+0.065684407 container create e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a (image=quay.io/ceph/ceph:v20, name=peaceful_bassi, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:08:00 compute-0 systemd[1]: Started libpod-conmon-e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a.scope.
Dec 03 21:08:00 compute-0 podman[75000]: 2025-12-03 21:08:00.298856409 +0000 UTC m=+0.037868998 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:00 compute-0 ceph-mon[74850]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 03 21:08:00 compute-0 ceph-mon[74850]: monmap epoch 1
Dec 03 21:08:00 compute-0 ceph-mon[74850]: fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:00 compute-0 ceph-mon[74850]: last_changed 2025-12-03T21:07:57.000116+0000
Dec 03 21:08:00 compute-0 ceph-mon[74850]: created 2025-12-03T21:07:57.000116+0000
Dec 03 21:08:00 compute-0 ceph-mon[74850]: min_mon_release 20 (tentacle)
Dec 03 21:08:00 compute-0 ceph-mon[74850]: election_strategy: 1
Dec 03 21:08:00 compute-0 ceph-mon[74850]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 03 21:08:00 compute-0 ceph-mon[74850]: fsmap 
Dec 03 21:08:00 compute-0 ceph-mon[74850]: osdmap e1: 0 total, 0 up, 0 in
Dec 03 21:08:00 compute-0 ceph-mon[74850]: mgrmap e1: no daemons active
Dec 03 21:08:00 compute-0 ceph-mon[74850]: from='client.? 192.168.122.100:0/1609393615' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:08:00 compute-0 ceph-mon[74850]: from='client.? 192.168.122.100:0/1150063760' entity='client.admin' cmd={"prefix": "config assimilate-conf"} : dispatch
Dec 03 21:08:00 compute-0 ceph-mon[74850]: from='client.? 192.168.122.100:0/1150063760' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Dec 03 21:08:00 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/894bdc412a29c6427fbcfac0856f1df5e1e8e096d47f27d60b2c0d6ef8766c44/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/894bdc412a29c6427fbcfac0856f1df5e1e8e096d47f27d60b2c0d6ef8766c44/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/894bdc412a29c6427fbcfac0856f1df5e1e8e096d47f27d60b2c0d6ef8766c44/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/894bdc412a29c6427fbcfac0856f1df5e1e8e096d47f27d60b2c0d6ef8766c44/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:00 compute-0 podman[75000]: 2025-12-03 21:08:00.447175161 +0000 UTC m=+0.186187680 container init e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a (image=quay.io/ceph/ceph:v20, name=peaceful_bassi, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:08:00 compute-0 podman[75000]: 2025-12-03 21:08:00.454310758 +0000 UTC m=+0.193323247 container start e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a (image=quay.io/ceph/ceph:v20, name=peaceful_bassi, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:08:00 compute-0 podman[75000]: 2025-12-03 21:08:00.458488551 +0000 UTC m=+0.197501080 container attach e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a (image=quay.io/ceph/ceph:v20, name=peaceful_bassi, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:08:00 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:00 compute-0 ceph-mon[74850]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3461721600' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:00 compute-0 systemd[1]: libpod-e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a.scope: Deactivated successfully.
Dec 03 21:08:00 compute-0 podman[75000]: 2025-12-03 21:08:00.705133647 +0000 UTC m=+0.444146156 container died e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a (image=quay.io/ceph/ceph:v20, name=peaceful_bassi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030)
Dec 03 21:08:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-894bdc412a29c6427fbcfac0856f1df5e1e8e096d47f27d60b2c0d6ef8766c44-merged.mount: Deactivated successfully.
Dec 03 21:08:00 compute-0 podman[75000]: 2025-12-03 21:08:00.761258556 +0000 UTC m=+0.500271075 container remove e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a (image=quay.io/ceph/ceph:v20, name=peaceful_bassi, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:08:00 compute-0 systemd[1]: libpod-conmon-e1ac53277320eccead34320e1185abc52d95db8738d29e9a70590b0a79ade31a.scope: Deactivated successfully.
Dec 03 21:08:00 compute-0 systemd[1]: Stopping Ceph mon.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:08:01 compute-0 ceph-mon[74850]: received  signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.compute-0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false  (PID: 1) UID: 0
Dec 03 21:08:01 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e1 *** Got Signal Terminated ***
Dec 03 21:08:01 compute-0 ceph-mon[74850]: mon.compute-0@0(leader) e1 shutdown
Dec 03 21:08:01 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0[74846]: 2025-12-03T21:08:01.046+0000 7f3e499eb640 -1 received  signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.compute-0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false  (PID: 1) UID: 0
Dec 03 21:08:01 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0[74846]: 2025-12-03T21:08:01.046+0000 7f3e499eb640 -1 mon.compute-0@0(leader) e1 *** Got Signal Terminated ***
Dec 03 21:08:01 compute-0 ceph-mon[74850]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 03 21:08:01 compute-0 ceph-mon[74850]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 03 21:08:01 compute-0 podman[75082]: 2025-12-03 21:08:01.077086565 +0000 UTC m=+0.083223481 container died 21a850393ffc3b228a0c2499d1dbfa6a53893dec4078b9c095d7b72c6199ac2e (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-ab7826f93da93a73ed4e24bc845da7f990ade20e4d48691fc3ded33160f103e6-merged.mount: Deactivated successfully.
Dec 03 21:08:01 compute-0 podman[75082]: 2025-12-03 21:08:01.113866485 +0000 UTC m=+0.120003431 container remove 21a850393ffc3b228a0c2499d1dbfa6a53893dec4078b9c095d7b72c6199ac2e (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:01 compute-0 bash[75082]: ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0
Dec 03 21:08:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:08:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 03 21:08:01 compute-0 systemd[1]: ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mon.compute-0.service: Deactivated successfully.
Dec 03 21:08:01 compute-0 systemd[1]: Stopped Ceph mon.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:08:01 compute-0 systemd[1]: ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mon.compute-0.service: Consumed 1.127s CPU time.
Dec 03 21:08:01 compute-0 systemd[1]: Starting Ceph mon.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:08:01 compute-0 podman[75184]: 2025-12-03 21:08:01.540826085 +0000 UTC m=+0.054105040 container create 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e130db53cb4ce5b03dc6db20b7b7e14193120feb03a76fb8dd45110fe5503780/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e130db53cb4ce5b03dc6db20b7b7e14193120feb03a76fb8dd45110fe5503780/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e130db53cb4ce5b03dc6db20b7b7e14193120feb03a76fb8dd45110fe5503780/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e130db53cb4ce5b03dc6db20b7b7e14193120feb03a76fb8dd45110fe5503780/merged/var/lib/ceph/mon/ceph-compute-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:01 compute-0 podman[75184]: 2025-12-03 21:08:01.509561391 +0000 UTC m=+0.022840326 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:01 compute-0 podman[75184]: 2025-12-03 21:08:01.615530854 +0000 UTC m=+0.128809819 container init 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Dec 03 21:08:01 compute-0 podman[75184]: 2025-12-03 21:08:01.633655204 +0000 UTC m=+0.146934129 container start 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True)
Dec 03 21:08:01 compute-0 bash[75184]: 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916
Dec 03 21:08:01 compute-0 systemd[1]: Started Ceph mon.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:08:01 compute-0 ceph-mon[75204]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-mon, pid 2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: pidfile_write: ignore empty --pid-file
Dec 03 21:08:01 compute-0 ceph-mon[75204]: load: jerasure load: lrc 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Git sha 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: DB SUMMARY
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: DB Session ID:  YRQHTOJ9E78VAMDNI6U1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: MANIFEST file:  MANIFEST-000010 size: 179 Bytes
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-0/store.db dir, Total Num: 1, files: 000008.sst 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-0/store.db: 000009.log size: 60239 ; 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                                     Options.env: 0x56170b774440
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                                Options.info_log: 0x56170d6a7e80
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                                 Options.wal_dir: 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                    Options.write_buffer_manager: 0x56170d6f2140
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                               Options.row_cache: None
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                              Options.wal_filter: None
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.wal_compression: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.max_background_jobs: 2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.max_total_wal_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:       Options.compaction_readahead_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Compression algorithms supported:
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kZSTD supported: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000010
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:           Options.merge_operator: 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:        Options.compaction_filter: None
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56170d6fea00)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x56170d6e38d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:        Options.write_buffer_size: 33554432
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:  Options.max_write_buffer_number: 2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:          Options.compression: NoCompression
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.num_levels: 7
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-0/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d83d641b-0db7-44b5-9540-349f4c36f664
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796081688909, "job": 1, "event": "recovery_started", "wal_files": [9]}
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796081694786, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 59960, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 143, "table_properties": {"data_size": 58438, "index_size": 164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 325, "raw_key_size": 3403, "raw_average_key_size": 30, "raw_value_size": 55790, "raw_average_value_size": 507, "num_data_blocks": 9, "num_entries": 110, "num_filter_entries": 110, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796081, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796081694915, "job": 1, "event": "recovery_finished"}
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:5047] Creating manifest 15
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56170d710e00
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: DB pointer 0x56170d85a000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:08:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0   60.45 KB   0.5      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     10.4      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Sum      2/0   60.45 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     10.4      0.01              0.00         1    0.005       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     10.4      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     10.4      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 3.15 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 3.15 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56170d6e38d0#2 capacity: 512.00 MB usage: 0.84 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5.6e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(2,0.48 KB,9.23872e-05%) IndexBlock(2,0.36 KB,6.85453e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 03 21:08:01 compute-0 ceph-mon[75204]: starting mon.compute-0 rank 0 at public addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] at bind addrs [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-0 fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???) e1 preinit fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???).mds e1 new map
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???).mds e1 print_map
                                           e1
                                           btime 2025-12-03T21:07:59:373870+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???).osd e1 crush map has features 3314932999778484224, adjusting msgr requires
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???).osd e1 crush map has features 288514050185494528, adjusting msgr requires
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(???).paxosservice(auth 1..2) refresh upgraded, format 0 -> 3
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@-1(probing) e1  my rank is now 0 (was -1)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@0(probing) e1 win_standalone_election
Dec 03 21:08:01 compute-0 ceph-mon[75204]: paxos.0).electionLogic(3) init, last seen epoch 3, mid-election, bumping
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@0(electing) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : monmap epoch 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : last_changed 2025-12-03T21:07:57.000116+0000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : created 2025-12-03T21:07:57.000116+0000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : min_mon_release 20 (tentacle)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : election_strategy: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : fsmap 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e1: 0 total, 0 up, 0 in
Dec 03 21:08:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e1: no daemons active
Dec 03 21:08:01 compute-0 podman[75205]: 2025-12-03 21:08:01.748780153 +0000 UTC m=+0.068399014 container create aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88 (image=quay.io/ceph/ceph:v20, name=infallible_khayyam, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mon.compute-0 is new leader, mons compute-0 in quorum (ranks 0)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: monmap epoch 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:01 compute-0 ceph-mon[75204]: last_changed 2025-12-03T21:07:57.000116+0000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: created 2025-12-03T21:07:57.000116+0000
Dec 03 21:08:01 compute-0 ceph-mon[75204]: min_mon_release 20 (tentacle)
Dec 03 21:08:01 compute-0 ceph-mon[75204]: election_strategy: 1
Dec 03 21:08:01 compute-0 ceph-mon[75204]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 03 21:08:01 compute-0 ceph-mon[75204]: fsmap 
Dec 03 21:08:01 compute-0 ceph-mon[75204]: osdmap e1: 0 total, 0 up, 0 in
Dec 03 21:08:01 compute-0 ceph-mon[75204]: mgrmap e1: no daemons active
Dec 03 21:08:01 compute-0 systemd[1]: Started libpod-conmon-aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88.scope.
Dec 03 21:08:01 compute-0 podman[75205]: 2025-12-03 21:08:01.723810995 +0000 UTC m=+0.043429936 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:01 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/583dae49ac19e390d41d20e6091bd1ca13dcaaae1a0443052ea6247447b623c6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/583dae49ac19e390d41d20e6091bd1ca13dcaaae1a0443052ea6247447b623c6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/583dae49ac19e390d41d20e6091bd1ca13dcaaae1a0443052ea6247447b623c6/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:01 compute-0 podman[75205]: 2025-12-03 21:08:01.865795389 +0000 UTC m=+0.185414340 container init aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88 (image=quay.io/ceph/ceph:v20, name=infallible_khayyam, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:08:01 compute-0 podman[75205]: 2025-12-03 21:08:01.878122875 +0000 UTC m=+0.197741766 container start aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88 (image=quay.io/ceph/ceph:v20, name=infallible_khayyam, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:01 compute-0 podman[75205]: 2025-12-03 21:08:01.882823122 +0000 UTC m=+0.202442013 container attach aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88 (image=quay.io/ceph/ceph:v20, name=infallible_khayyam, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:08:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=public_network}] v 0)
Dec 03 21:08:02 compute-0 systemd[1]: libpod-aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88.scope: Deactivated successfully.
Dec 03 21:08:02 compute-0 podman[75287]: 2025-12-03 21:08:02.218922031 +0000 UTC m=+0.040149214 container died aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88 (image=quay.io/ceph/ceph:v20, name=infallible_khayyam, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-583dae49ac19e390d41d20e6091bd1ca13dcaaae1a0443052ea6247447b623c6-merged.mount: Deactivated successfully.
Dec 03 21:08:02 compute-0 podman[75287]: 2025-12-03 21:08:02.268648383 +0000 UTC m=+0.089875516 container remove aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88 (image=quay.io/ceph/ceph:v20, name=infallible_khayyam, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3)
Dec 03 21:08:02 compute-0 systemd[1]: libpod-conmon-aacd22b91a099592e8cc13926950d97b036e870949e234f6daa4018351940b88.scope: Deactivated successfully.
Dec 03 21:08:02 compute-0 podman[75302]: 2025-12-03 21:08:02.38568804 +0000 UTC m=+0.072937606 container create 8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232 (image=quay.io/ceph/ceph:v20, name=blissful_austin, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:08:02 compute-0 systemd[1]: Started libpod-conmon-8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232.scope.
Dec 03 21:08:02 compute-0 podman[75302]: 2025-12-03 21:08:02.353297098 +0000 UTC m=+0.040546714 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:02 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db6192df3ca52717e3c7f28e5e3492884e9412372b30d119b45894d8285d9772/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db6192df3ca52717e3c7f28e5e3492884e9412372b30d119b45894d8285d9772/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db6192df3ca52717e3c7f28e5e3492884e9412372b30d119b45894d8285d9772/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:02 compute-0 podman[75302]: 2025-12-03 21:08:02.481282356 +0000 UTC m=+0.168531982 container init 8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232 (image=quay.io/ceph/ceph:v20, name=blissful_austin, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:08:02 compute-0 podman[75302]: 2025-12-03 21:08:02.491193772 +0000 UTC m=+0.178443338 container start 8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232 (image=quay.io/ceph/ceph:v20, name=blissful_austin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:08:02 compute-0 podman[75302]: 2025-12-03 21:08:02.496488003 +0000 UTC m=+0.183737569 container attach 8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232 (image=quay.io/ceph/ceph:v20, name=blissful_austin, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:08:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=cluster_network}] v 0)
Dec 03 21:08:02 compute-0 systemd[1]: libpod-8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232.scope: Deactivated successfully.
Dec 03 21:08:02 compute-0 podman[75302]: 2025-12-03 21:08:02.750246065 +0000 UTC m=+0.437495601 container died 8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232 (image=quay.io/ceph/ceph:v20, name=blissful_austin, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-db6192df3ca52717e3c7f28e5e3492884e9412372b30d119b45894d8285d9772-merged.mount: Deactivated successfully.
Dec 03 21:08:02 compute-0 podman[75302]: 2025-12-03 21:08:02.790483111 +0000 UTC m=+0.477732657 container remove 8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232 (image=quay.io/ceph/ceph:v20, name=blissful_austin, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:02 compute-0 systemd[1]: libpod-conmon-8bdbfc7f531e0d29fe5fb445987d6cd3ba91162bbd0acc9b5b6a0cad4fa76232.scope: Deactivated successfully.
Dec 03 21:08:02 compute-0 systemd[1]: Reloading.
Dec 03 21:08:02 compute-0 systemd-sysv-generator[75383]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:02 compute-0 systemd-rc-local-generator[75380]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:03 compute-0 systemd[1]: Reloading.
Dec 03 21:08:03 compute-0 systemd-sysv-generator[75421]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:03 compute-0 systemd-rc-local-generator[75418]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:03 compute-0 systemd[1]: Starting Ceph mgr.compute-0.jxauqt for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:08:03 compute-0 podman[75481]: 2025-12-03 21:08:03.655194037 +0000 UTC m=+0.058078139 container create 3ad5fa1a42ade6349b66920b152c5e56f1754f06ff5829bb4056fc775998f6a5 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0)
Dec 03 21:08:03 compute-0 podman[75481]: 2025-12-03 21:08:03.627622525 +0000 UTC m=+0.030506717 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6ad6dc267861ffde8868c0189956fb90082124991312250387d49919eba603a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6ad6dc267861ffde8868c0189956fb90082124991312250387d49919eba603a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6ad6dc267861ffde8868c0189956fb90082124991312250387d49919eba603a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6ad6dc267861ffde8868c0189956fb90082124991312250387d49919eba603a/merged/var/lib/ceph/mgr/ceph-compute-0.jxauqt supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:03 compute-0 podman[75481]: 2025-12-03 21:08:03.74131914 +0000 UTC m=+0.144203312 container init 3ad5fa1a42ade6349b66920b152c5e56f1754f06ff5829bb4056fc775998f6a5 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:03 compute-0 podman[75481]: 2025-12-03 21:08:03.750474996 +0000 UTC m=+0.153359138 container start 3ad5fa1a42ade6349b66920b152c5e56f1754f06ff5829bb4056fc775998f6a5 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:03 compute-0 bash[75481]: 3ad5fa1a42ade6349b66920b152c5e56f1754f06ff5829bb4056fc775998f6a5
Dec 03 21:08:03 compute-0 systemd[1]: Started Ceph mgr.compute-0.jxauqt for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:08:03 compute-0 ceph-mgr[75500]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:08:03 compute-0 ceph-mgr[75500]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-mgr, pid 2
Dec 03 21:08:03 compute-0 ceph-mgr[75500]: pidfile_write: ignore empty --pid-file
Dec 03 21:08:03 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'alerts'
Dec 03 21:08:03 compute-0 podman[75501]: 2025-12-03 21:08:03.890065992 +0000 UTC m=+0.085410385 container create b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd (image=quay.io/ceph/ceph:v20, name=frosty_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:08:03 compute-0 systemd[1]: Started libpod-conmon-b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd.scope.
Dec 03 21:08:03 compute-0 podman[75501]: 2025-12-03 21:08:03.862051408 +0000 UTC m=+0.057395881 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:03 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'balancer'
Dec 03 21:08:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ca44ce071279ffe8e8f3003bb58f867241c249356a01a566add5026a2cf113c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ca44ce071279ffe8e8f3003bb58f867241c249356a01a566add5026a2cf113c/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ca44ce071279ffe8e8f3003bb58f867241c249356a01a566add5026a2cf113c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:04 compute-0 podman[75501]: 2025-12-03 21:08:04.007824657 +0000 UTC m=+0.203169130 container init b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd (image=quay.io/ceph/ceph:v20, name=frosty_kirch, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:04 compute-0 podman[75501]: 2025-12-03 21:08:04.019985098 +0000 UTC m=+0.215329521 container start b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd (image=quay.io/ceph/ceph:v20, name=frosty_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:08:04 compute-0 podman[75501]: 2025-12-03 21:08:04.024616233 +0000 UTC m=+0.219960656 container attach b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd (image=quay.io/ceph/ceph:v20, name=frosty_kirch, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:04 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'cephadm'
Dec 03 21:08:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec 03 21:08:04 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/121509719' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:04 compute-0 frosty_kirch[75538]: 
Dec 03 21:08:04 compute-0 frosty_kirch[75538]: {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "health": {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "status": "HEALTH_OK",
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "checks": {},
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "mutes": []
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     },
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "election_epoch": 5,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "quorum": [
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         0
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     ],
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "quorum_names": [
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "compute-0"
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     ],
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "quorum_age": 2,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "monmap": {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "epoch": 1,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "min_mon_release_name": "tentacle",
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_mons": 1
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     },
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "osdmap": {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "epoch": 1,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_osds": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_up_osds": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "osd_up_since": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_in_osds": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "osd_in_since": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_remapped_pgs": 0
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     },
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "pgmap": {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "pgs_by_state": [],
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_pgs": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_pools": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_objects": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "data_bytes": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "bytes_used": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "bytes_avail": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "bytes_total": 0
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     },
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "fsmap": {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "epoch": 1,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "btime": "2025-12-03T21:07:59.373870+0000",
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "by_rank": [],
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "up:standby": 0
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     },
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "mgrmap": {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "available": false,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "num_standbys": 0,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "modules": [
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:             "iostat",
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:             "nfs"
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         ],
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "services": {}
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     },
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "servicemap": {
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "epoch": 1,
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "modified": "2025-12-03T21:07:59.377140+0000",
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:         "services": {}
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     },
Dec 03 21:08:04 compute-0 frosty_kirch[75538]:     "progress_events": {}
Dec 03 21:08:04 compute-0 frosty_kirch[75538]: }
Dec 03 21:08:04 compute-0 systemd[1]: libpod-b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd.scope: Deactivated successfully.
Dec 03 21:08:04 compute-0 podman[75501]: 2025-12-03 21:08:04.255823457 +0000 UTC m=+0.451167880 container died b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd (image=quay.io/ceph/ceph:v20, name=frosty_kirch, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-3ca44ce071279ffe8e8f3003bb58f867241c249356a01a566add5026a2cf113c-merged.mount: Deactivated successfully.
Dec 03 21:08:04 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/121509719' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:04 compute-0 podman[75501]: 2025-12-03 21:08:04.302188514 +0000 UTC m=+0.497532917 container remove b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd (image=quay.io/ceph/ceph:v20, name=frosty_kirch, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 03 21:08:04 compute-0 systemd[1]: libpod-conmon-b00c9920f25eca3171efe402eaf78e998e56a4de574b1ff919706af7f893e4bd.scope: Deactivated successfully.
Dec 03 21:08:04 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'crash'
Dec 03 21:08:04 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'dashboard'
Dec 03 21:08:05 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'devicehealth'
Dec 03 21:08:05 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'diskprediction_local'
Dec 03 21:08:05 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 03 21:08:05 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 03 21:08:05 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]:   from numpy import show_config as show_numpy_config
Dec 03 21:08:05 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'influx'
Dec 03 21:08:05 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'insights'
Dec 03 21:08:05 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'iostat'
Dec 03 21:08:06 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'k8sevents'
Dec 03 21:08:06 compute-0 podman[75587]: 2025-12-03 21:08:06.405706108 +0000 UTC m=+0.067839011 container create 9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c (image=quay.io/ceph/ceph:v20, name=laughing_kapitsa, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:08:06 compute-0 systemd[1]: Started libpod-conmon-9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c.scope.
Dec 03 21:08:06 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'localpool'
Dec 03 21:08:06 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b277afcaa35763dbdc6bf713cdfbadaf880e7676ccf3f62fb5ac0874433d08ff/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b277afcaa35763dbdc6bf713cdfbadaf880e7676ccf3f62fb5ac0874433d08ff/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b277afcaa35763dbdc6bf713cdfbadaf880e7676ccf3f62fb5ac0874433d08ff/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:06 compute-0 podman[75587]: 2025-12-03 21:08:06.382328449 +0000 UTC m=+0.044461392 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:06 compute-0 podman[75587]: 2025-12-03 21:08:06.489917242 +0000 UTC m=+0.152050155 container init 9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c (image=quay.io/ceph/ceph:v20, name=laughing_kapitsa, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:06 compute-0 podman[75587]: 2025-12-03 21:08:06.497277935 +0000 UTC m=+0.159410868 container start 9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c (image=quay.io/ceph/ceph:v20, name=laughing_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:08:06 compute-0 podman[75587]: 2025-12-03 21:08:06.501503349 +0000 UTC m=+0.163636262 container attach 9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c (image=quay.io/ceph/ceph:v20, name=laughing_kapitsa, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:08:06 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'mds_autoscaler'
Dec 03 21:08:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec 03 21:08:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2727857518' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]: 
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]: {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "health": {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "status": "HEALTH_OK",
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "checks": {},
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "mutes": []
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     },
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "election_epoch": 5,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "quorum": [
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         0
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     ],
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "quorum_names": [
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "compute-0"
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     ],
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "quorum_age": 4,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "monmap": {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "epoch": 1,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "min_mon_release_name": "tentacle",
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_mons": 1
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     },
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "osdmap": {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "epoch": 1,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_osds": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_up_osds": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "osd_up_since": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_in_osds": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "osd_in_since": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_remapped_pgs": 0
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     },
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "pgmap": {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "pgs_by_state": [],
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_pgs": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_pools": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_objects": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "data_bytes": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "bytes_used": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "bytes_avail": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "bytes_total": 0
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     },
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "fsmap": {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "epoch": 1,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "btime": "2025-12-03T21:07:59.373870+0000",
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "by_rank": [],
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "up:standby": 0
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     },
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "mgrmap": {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "available": false,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "num_standbys": 0,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "modules": [
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:             "iostat",
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:             "nfs"
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         ],
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "services": {}
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     },
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "servicemap": {
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "epoch": 1,
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "modified": "2025-12-03T21:07:59.377140+0000",
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:         "services": {}
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     },
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]:     "progress_events": {}
Dec 03 21:08:06 compute-0 laughing_kapitsa[75603]: }
Dec 03 21:08:06 compute-0 systemd[1]: libpod-9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c.scope: Deactivated successfully.
Dec 03 21:08:06 compute-0 podman[75587]: 2025-12-03 21:08:06.727379381 +0000 UTC m=+0.389512284 container died 9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c (image=quay.io/ceph/ceph:v20, name=laughing_kapitsa, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:08:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-b277afcaa35763dbdc6bf713cdfbadaf880e7676ccf3f62fb5ac0874433d08ff-merged.mount: Deactivated successfully.
Dec 03 21:08:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2727857518' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:06 compute-0 podman[75587]: 2025-12-03 21:08:06.776023215 +0000 UTC m=+0.438156108 container remove 9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c (image=quay.io/ceph/ceph:v20, name=laughing_kapitsa, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:08:06 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'mirroring'
Dec 03 21:08:06 compute-0 systemd[1]: libpod-conmon-9fbb584e7041a0523ae33c5b6f57cb02c3d6d0fff872a534c3e619fc543c394c.scope: Deactivated successfully.
Dec 03 21:08:06 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'nfs'
Dec 03 21:08:07 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'orchestrator'
Dec 03 21:08:07 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'osd_perf_query'
Dec 03 21:08:07 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'osd_support'
Dec 03 21:08:07 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'pg_autoscaler'
Dec 03 21:08:07 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'progress'
Dec 03 21:08:07 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'prometheus'
Dec 03 21:08:07 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'rbd_support'
Dec 03 21:08:08 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'rgw'
Dec 03 21:08:08 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'rook'
Dec 03 21:08:08 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'selftest'
Dec 03 21:08:08 compute-0 podman[75641]: 2025-12-03 21:08:08.882782049 +0000 UTC m=+0.073775187 container create d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb (image=quay.io/ceph/ceph:v20, name=naughty_dijkstra, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:08:08 compute-0 systemd[1]: Started libpod-conmon-d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb.scope.
Dec 03 21:08:08 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'smb'
Dec 03 21:08:08 compute-0 podman[75641]: 2025-12-03 21:08:08.850980892 +0000 UTC m=+0.041974100 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:08 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/691a5aafc96b94094f44d25792e17c888960584ed12150e4ada5255d079f3c69/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/691a5aafc96b94094f44d25792e17c888960584ed12150e4ada5255d079f3c69/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/691a5aafc96b94094f44d25792e17c888960584ed12150e4ada5255d079f3c69/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:08 compute-0 podman[75641]: 2025-12-03 21:08:08.977217767 +0000 UTC m=+0.168210955 container init d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb (image=quay.io/ceph/ceph:v20, name=naughty_dijkstra, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:08:08 compute-0 podman[75641]: 2025-12-03 21:08:08.984456406 +0000 UTC m=+0.175449544 container start d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb (image=quay.io/ceph/ceph:v20, name=naughty_dijkstra, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:08 compute-0 podman[75641]: 2025-12-03 21:08:08.988909036 +0000 UTC m=+0.179902164 container attach d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb (image=quay.io/ceph/ceph:v20, name=naughty_dijkstra, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:08:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec 03 21:08:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/829808550' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]: 
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]: {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "health": {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "status": "HEALTH_OK",
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "checks": {},
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "mutes": []
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     },
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "election_epoch": 5,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "quorum": [
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         0
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     ],
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "quorum_names": [
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "compute-0"
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     ],
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "quorum_age": 7,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "monmap": {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "epoch": 1,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "min_mon_release_name": "tentacle",
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_mons": 1
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     },
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "osdmap": {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "epoch": 1,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_osds": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_up_osds": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "osd_up_since": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_in_osds": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "osd_in_since": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_remapped_pgs": 0
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     },
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "pgmap": {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "pgs_by_state": [],
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_pgs": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_pools": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_objects": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "data_bytes": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "bytes_used": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "bytes_avail": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "bytes_total": 0
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     },
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "fsmap": {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "epoch": 1,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "btime": "2025-12-03T21:07:59:373870+0000",
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "by_rank": [],
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "up:standby": 0
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     },
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "mgrmap": {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "available": false,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "num_standbys": 0,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "modules": [
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:             "iostat",
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:             "nfs"
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         ],
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "services": {}
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     },
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "servicemap": {
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "epoch": 1,
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "modified": "2025-12-03T21:07:59.377140+0000",
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:         "services": {}
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     },
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]:     "progress_events": {}
Dec 03 21:08:09 compute-0 naughty_dijkstra[75658]: }
Dec 03 21:08:09 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'snap_schedule'
Dec 03 21:08:09 compute-0 systemd[1]: libpod-d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb.scope: Deactivated successfully.
Dec 03 21:08:09 compute-0 podman[75641]: 2025-12-03 21:08:09.217317371 +0000 UTC m=+0.408310509 container died d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb (image=quay.io/ceph/ceph:v20, name=naughty_dijkstra, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-691a5aafc96b94094f44d25792e17c888960584ed12150e4ada5255d079f3c69-merged.mount: Deactivated successfully.
Dec 03 21:08:09 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/829808550' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:09 compute-0 podman[75641]: 2025-12-03 21:08:09.273483991 +0000 UTC m=+0.464477129 container remove d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb (image=quay.io/ceph/ceph:v20, name=naughty_dijkstra, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:09 compute-0 systemd[1]: libpod-conmon-d0ca0f7bed9c496f975e44266c534d42743ab5f396b8a936c712ed096c2cf0eb.scope: Deactivated successfully.
Dec 03 21:08:09 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'stats'
Dec 03 21:08:09 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'status'
Dec 03 21:08:09 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'telegraf'
Dec 03 21:08:09 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'telemetry'
Dec 03 21:08:09 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'test_orchestrator'
Dec 03 21:08:09 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'volumes'
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: ms_deliver_dispatch: unhandled message 0x56347b571860 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Activating manager daemon compute-0.jxauqt
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr handle_mgr_map Activating!
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr handle_mgr_map I am now activating
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e2: compute-0.jxauqt(active, starting, since 0.012688s)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata"} v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mds metadata"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e1 all = 1
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata"} v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata", "id": "compute-0"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-0.jxauqt", "id": "compute-0.jxauqt"} v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr metadata", "who": "compute-0.jxauqt", "id": "compute-0.jxauqt"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: balancer
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [balancer INFO root] Starting
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: crash
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Manager daemon compute-0.jxauqt is now available
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:08:10
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [balancer INFO root] No pools available
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: devicehealth
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [devicehealth INFO root] Starting
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: iostat
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: nfs
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: orchestrator
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: pg_autoscaler
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: progress
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [progress INFO root] Loading...
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [progress INFO root] No stored events to load
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [progress INFO root] Loaded [] historic events
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [progress INFO root] Loaded OSDMap, ready.
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] recovery thread starting
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] starting setup
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: rbd_support
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: status
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: telemetry
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/mirror_snapshot_schedule"} v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/mirror_snapshot_schedule"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/report_id}] v 0)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] PerfHandler: starting
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TaskHandler: starting
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/trash_purge_schedule"} v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/trash_purge_schedule"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/salt}] v 0)
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: [rbd_support INFO root] setup complete
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:10 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: volumes
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/telemetry/collection}] v 0)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:10 compute-0 ceph-mon[75204]: Activating manager daemon compute-0.jxauqt
Dec 03 21:08:10 compute-0 ceph-mon[75204]: mgrmap e2: compute-0.jxauqt(active, starting, since 0.012688s)
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mds metadata"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata", "id": "compute-0"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr metadata", "who": "compute-0.jxauqt", "id": "compute-0.jxauqt"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: Manager daemon compute-0.jxauqt is now available
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/mirror_snapshot_schedule"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/trash_purge_schedule"} : dispatch
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:10 compute-0 ceph-mon[75204]: from='mgr.14102 192.168.122.100:0/355953899' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:11 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e3: compute-0.jxauqt(active, since 1.02768s)
Dec 03 21:08:11 compute-0 podman[75774]: 2025-12-03 21:08:11.377563499 +0000 UTC m=+0.070424085 container create 365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f (image=quay.io/ceph/ceph:v20, name=jovial_ptolemy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:11 compute-0 systemd[1]: Started libpod-conmon-365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f.scope.
Dec 03 21:08:11 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:11 compute-0 podman[75774]: 2025-12-03 21:08:11.34894817 +0000 UTC m=+0.041808806 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4c7bf5af164041e03e645a94edd7926ba2d9933da01eeaf2bc9f8a516d63294/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4c7bf5af164041e03e645a94edd7926ba2d9933da01eeaf2bc9f8a516d63294/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4c7bf5af164041e03e645a94edd7926ba2d9933da01eeaf2bc9f8a516d63294/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:11 compute-0 podman[75774]: 2025-12-03 21:08:11.452424902 +0000 UTC m=+0.145285468 container init 365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f (image=quay.io/ceph/ceph:v20, name=jovial_ptolemy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:08:11 compute-0 podman[75774]: 2025-12-03 21:08:11.456831521 +0000 UTC m=+0.149692067 container start 365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f (image=quay.io/ceph/ceph:v20, name=jovial_ptolemy, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:11 compute-0 podman[75774]: 2025-12-03 21:08:11.459927177 +0000 UTC m=+0.152787733 container attach 365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f (image=quay.io/ceph/ceph:v20, name=jovial_ptolemy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:08:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec 03 21:08:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2867261980' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]: 
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]: {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "health": {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "status": "HEALTH_OK",
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "checks": {},
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "mutes": []
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     },
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "election_epoch": 5,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "quorum": [
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         0
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     ],
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "quorum_names": [
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "compute-0"
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     ],
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "quorum_age": 10,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "monmap": {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "epoch": 1,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "min_mon_release_name": "tentacle",
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_mons": 1
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     },
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "osdmap": {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "epoch": 1,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_osds": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_up_osds": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "osd_up_since": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_in_osds": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "osd_in_since": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_remapped_pgs": 0
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     },
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "pgmap": {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "pgs_by_state": [],
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_pgs": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_pools": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_objects": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "data_bytes": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "bytes_used": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "bytes_avail": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "bytes_total": 0
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     },
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "fsmap": {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "epoch": 1,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "btime": "2025-12-03T21:07:59.373870+0000",
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "by_rank": [],
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "up:standby": 0
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     },
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "mgrmap": {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "available": true,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "num_standbys": 0,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "modules": [
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:             "iostat",
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:             "nfs"
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         ],
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "services": {}
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     },
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "servicemap": {
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "epoch": 1,
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "modified": "2025-12-03T21:07:59.377140+0000",
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:         "services": {}
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     },
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]:     "progress_events": {}
Dec 03 21:08:11 compute-0 jovial_ptolemy[75790]: }
Dec 03 21:08:11 compute-0 systemd[1]: libpod-365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f.scope: Deactivated successfully.
Dec 03 21:08:11 compute-0 podman[75774]: 2025-12-03 21:08:11.981636952 +0000 UTC m=+0.674497498 container died 365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f (image=quay.io/ceph/ceph:v20, name=jovial_ptolemy, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:08:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-f4c7bf5af164041e03e645a94edd7926ba2d9933da01eeaf2bc9f8a516d63294-merged.mount: Deactivated successfully.
Dec 03 21:08:12 compute-0 podman[75774]: 2025-12-03 21:08:12.033942697 +0000 UTC m=+0.726803283 container remove 365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f (image=quay.io/ceph/ceph:v20, name=jovial_ptolemy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:08:12 compute-0 systemd[1]: libpod-conmon-365945f97c1efdeea8e37357856d7fe61a63bcba1d09a71873613d28b9e1fc8f.scope: Deactivated successfully.
Dec 03 21:08:12 compute-0 podman[75829]: 2025-12-03 21:08:12.134505447 +0000 UTC m=+0.068519587 container create c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69 (image=quay.io/ceph/ceph:v20, name=amazing_banzai, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:08:12 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:12 compute-0 systemd[1]: Started libpod-conmon-c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69.scope.
Dec 03 21:08:12 compute-0 ceph-mon[75204]: mgrmap e3: compute-0.jxauqt(active, since 1.02768s)
Dec 03 21:08:12 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2867261980' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:08:12 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:12 compute-0 podman[75829]: 2025-12-03 21:08:12.105018496 +0000 UTC m=+0.039032676 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:12 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e4: compute-0.jxauqt(active, since 2s)
Dec 03 21:08:12 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfbe208bd9ca0709a6492187b58d9c27119f76e0a809027689a9e341ec9c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfbe208bd9ca0709a6492187b58d9c27119f76e0a809027689a9e341ec9c5/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfbe208bd9ca0709a6492187b58d9c27119f76e0a809027689a9e341ec9c5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfbe208bd9ca0709a6492187b58d9c27119f76e0a809027689a9e341ec9c5/merged/var/lib/ceph/user.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:12 compute-0 podman[75829]: 2025-12-03 21:08:12.237285852 +0000 UTC m=+0.171299992 container init c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69 (image=quay.io/ceph/ceph:v20, name=amazing_banzai, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:12 compute-0 podman[75829]: 2025-12-03 21:08:12.247834742 +0000 UTC m=+0.181848872 container start c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69 (image=quay.io/ceph/ceph:v20, name=amazing_banzai, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:08:12 compute-0 podman[75829]: 2025-12-03 21:08:12.252137119 +0000 UTC m=+0.186151309 container attach c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69 (image=quay.io/ceph/ceph:v20, name=amazing_banzai, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config assimilate-conf"} v 0)
Dec 03 21:08:12 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1362526996' entity='client.admin' cmd={"prefix": "config assimilate-conf"} : dispatch
Dec 03 21:08:12 compute-0 amazing_banzai[75846]: 
Dec 03 21:08:12 compute-0 amazing_banzai[75846]: [global]
Dec 03 21:08:12 compute-0 amazing_banzai[75846]:         fsid = c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:12 compute-0 amazing_banzai[75846]:         mon_host = [v2:192.168.122.100:3300,v1:192.168.122.100:6789]
Dec 03 21:08:12 compute-0 amazing_banzai[75846]:         osd_crush_chooseleaf_type = 0
Dec 03 21:08:12 compute-0 systemd[1]: libpod-c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69.scope: Deactivated successfully.
Dec 03 21:08:12 compute-0 podman[75829]: 2025-12-03 21:08:12.677654353 +0000 UTC m=+0.611668513 container died c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69 (image=quay.io/ceph/ceph:v20, name=amazing_banzai, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:08:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-bbbcfbe208bd9ca0709a6492187b58d9c27119f76e0a809027689a9e341ec9c5-merged.mount: Deactivated successfully.
Dec 03 21:08:12 compute-0 podman[75829]: 2025-12-03 21:08:12.71837187 +0000 UTC m=+0.652385970 container remove c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69 (image=quay.io/ceph/ceph:v20, name=amazing_banzai, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:08:12 compute-0 systemd[1]: libpod-conmon-c1a0a7c3309e80aae883d82985494f464a34503b9f45b64aa543f8a6caae0c69.scope: Deactivated successfully.
Dec 03 21:08:12 compute-0 podman[75885]: 2025-12-03 21:08:12.800495574 +0000 UTC m=+0.056712785 container create ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572 (image=quay.io/ceph/ceph:v20, name=amazing_yalow, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:12 compute-0 systemd[1]: Started libpod-conmon-ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572.scope.
Dec 03 21:08:12 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2b6f5265776cd1e7e8c0e3843cfa5075ce901cd1499302428e39487089036c4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2b6f5265776cd1e7e8c0e3843cfa5075ce901cd1499302428e39487089036c4/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2b6f5265776cd1e7e8c0e3843cfa5075ce901cd1499302428e39487089036c4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:12 compute-0 podman[75885]: 2025-12-03 21:08:12.772522392 +0000 UTC m=+0.028739643 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:12 compute-0 podman[75885]: 2025-12-03 21:08:12.88314435 +0000 UTC m=+0.139361601 container init ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572 (image=quay.io/ceph/ceph:v20, name=amazing_yalow, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:08:12 compute-0 podman[75885]: 2025-12-03 21:08:12.892621285 +0000 UTC m=+0.148838456 container start ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572 (image=quay.io/ceph/ceph:v20, name=amazing_yalow, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:12 compute-0 podman[75885]: 2025-12-03 21:08:12.896561852 +0000 UTC m=+0.152779043 container attach ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572 (image=quay.io/ceph/ceph:v20, name=amazing_yalow, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:08:13 compute-0 ceph-mon[75204]: mgrmap e4: compute-0.jxauqt(active, since 2s)
Dec 03 21:08:13 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1362526996' entity='client.admin' cmd={"prefix": "config assimilate-conf"} : dispatch
Dec 03 21:08:13 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0)
Dec 03 21:08:13 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1711009155' entity='client.admin' cmd={"prefix": "mgr module enable", "module": "cephadm"} : dispatch
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:14 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1711009155' entity='client.admin' cmd={"prefix": "mgr module enable", "module": "cephadm"} : dispatch
Dec 03 21:08:14 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1711009155' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  1: '-n'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  2: 'mgr.compute-0.jxauqt'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  3: '-f'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  4: '--setuser'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  5: 'ceph'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  6: '--setgroup'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  7: 'ceph'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  8: '--default-log-to-file=false'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  9: '--default-log-to-journald=true'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr respawn  exe_path /proc/self/exe
Dec 03 21:08:14 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e5: compute-0.jxauqt(active, since 4s)
Dec 03 21:08:14 compute-0 systemd[1]: libpod-ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572.scope: Deactivated successfully.
Dec 03 21:08:14 compute-0 podman[75885]: 2025-12-03 21:08:14.374878068 +0000 UTC m=+1.631095259 container died ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572 (image=quay.io/ceph/ceph:v20, name=amazing_yalow, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle)
Dec 03 21:08:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-d2b6f5265776cd1e7e8c0e3843cfa5075ce901cd1499302428e39487089036c4-merged.mount: Deactivated successfully.
Dec 03 21:08:14 compute-0 podman[75885]: 2025-12-03 21:08:14.416681453 +0000 UTC m=+1.672898634 container remove ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572 (image=quay.io/ceph/ceph:v20, name=amazing_yalow, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:08:14 compute-0 systemd[1]: libpod-conmon-ea5540398642d6005d3abdf2cf973eb64190e828c8d36b15725a5a325a410572.scope: Deactivated successfully.
Dec 03 21:08:14 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: ignoring --setuser ceph since I am not root
Dec 03 21:08:14 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: ignoring --setgroup ceph since I am not root
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-mgr, pid 2
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: pidfile_write: ignore empty --pid-file
Dec 03 21:08:14 compute-0 podman[75938]: 2025-12-03 21:08:14.497256317 +0000 UTC m=+0.058951361 container create 92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2 (image=quay.io/ceph/ceph:v20, name=unruffled_lehmann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'alerts'
Dec 03 21:08:14 compute-0 systemd[1]: Started libpod-conmon-92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2.scope.
Dec 03 21:08:14 compute-0 podman[75938]: 2025-12-03 21:08:14.469829879 +0000 UTC m=+0.031524993 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:14 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3df54f66bb4e31dfaedb7c2751db292414d2130a21981565c0e267c097294dc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3df54f66bb4e31dfaedb7c2751db292414d2130a21981565c0e267c097294dc/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3df54f66bb4e31dfaedb7c2751db292414d2130a21981565c0e267c097294dc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'balancer'
Dec 03 21:08:14 compute-0 podman[75938]: 2025-12-03 21:08:14.602226596 +0000 UTC m=+0.163921670 container init 92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2 (image=quay.io/ceph/ceph:v20, name=unruffled_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:08:14 compute-0 podman[75938]: 2025-12-03 21:08:14.609426514 +0000 UTC m=+0.171121578 container start 92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2 (image=quay.io/ceph/ceph:v20, name=unruffled_lehmann, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:08:14 compute-0 podman[75938]: 2025-12-03 21:08:14.613637788 +0000 UTC m=+0.175332822 container attach 92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2 (image=quay.io/ceph/ceph:v20, name=unruffled_lehmann, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Dec 03 21:08:14 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'cephadm'
Dec 03 21:08:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 03 21:08:15 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1638651339' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 03 21:08:15 compute-0 unruffled_lehmann[75974]: {
Dec 03 21:08:15 compute-0 unruffled_lehmann[75974]:     "epoch": 5,
Dec 03 21:08:15 compute-0 unruffled_lehmann[75974]:     "available": true,
Dec 03 21:08:15 compute-0 unruffled_lehmann[75974]:     "active_name": "compute-0.jxauqt",
Dec 03 21:08:15 compute-0 unruffled_lehmann[75974]:     "num_standby": 0
Dec 03 21:08:15 compute-0 unruffled_lehmann[75974]: }
Dec 03 21:08:15 compute-0 systemd[1]: libpod-92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2.scope: Deactivated successfully.
Dec 03 21:08:15 compute-0 podman[75938]: 2025-12-03 21:08:15.125642463 +0000 UTC m=+0.687337497 container died 92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2 (image=quay.io/ceph/ceph:v20, name=unruffled_lehmann, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 03 21:08:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-f3df54f66bb4e31dfaedb7c2751db292414d2130a21981565c0e267c097294dc-merged.mount: Deactivated successfully.
Dec 03 21:08:15 compute-0 podman[75938]: 2025-12-03 21:08:15.160770153 +0000 UTC m=+0.722465187 container remove 92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2 (image=quay.io/ceph/ceph:v20, name=unruffled_lehmann, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:15 compute-0 systemd[1]: libpod-conmon-92f34ac4fc50af42c3d00b6f08f8b8efd2852d65de3e90ba1848191eea754aa2.scope: Deactivated successfully.
Dec 03 21:08:15 compute-0 podman[76023]: 2025-12-03 21:08:15.218718917 +0000 UTC m=+0.040011521 container create 6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799 (image=quay.io/ceph/ceph:v20, name=agitated_solomon, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:15 compute-0 systemd[1]: Started libpod-conmon-6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799.scope.
Dec 03 21:08:15 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb8808b3f871a4539265f66bc53d7bbb252ede060b9af73bba2e0b3b3851f3e8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb8808b3f871a4539265f66bc53d7bbb252ede060b9af73bba2e0b3b3851f3e8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb8808b3f871a4539265f66bc53d7bbb252ede060b9af73bba2e0b3b3851f3e8/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:15 compute-0 podman[76023]: 2025-12-03 21:08:15.284723852 +0000 UTC m=+0.106016486 container init 6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799 (image=quay.io/ceph/ceph:v20, name=agitated_solomon, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:15 compute-0 podman[76023]: 2025-12-03 21:08:15.294411071 +0000 UTC m=+0.115703675 container start 6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799 (image=quay.io/ceph/ceph:v20, name=agitated_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:15 compute-0 podman[76023]: 2025-12-03 21:08:15.200795604 +0000 UTC m=+0.022088248 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:15 compute-0 podman[76023]: 2025-12-03 21:08:15.297880237 +0000 UTC m=+0.119172841 container attach 6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799 (image=quay.io/ceph/ceph:v20, name=agitated_solomon, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:15 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1711009155' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished
Dec 03 21:08:15 compute-0 ceph-mon[75204]: mgrmap e5: compute-0.jxauqt(active, since 4s)
Dec 03 21:08:15 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1638651339' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 03 21:08:15 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'crash'
Dec 03 21:08:15 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'dashboard'
Dec 03 21:08:16 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'devicehealth'
Dec 03 21:08:16 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'diskprediction_local'
Dec 03 21:08:16 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 03 21:08:16 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 03 21:08:16 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]:   from numpy import show_config as show_numpy_config
Dec 03 21:08:16 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'influx'
Dec 03 21:08:16 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'insights'
Dec 03 21:08:16 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'iostat'
Dec 03 21:08:16 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'k8sevents'
Dec 03 21:08:17 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'localpool'
Dec 03 21:08:17 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'mds_autoscaler'
Dec 03 21:08:17 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'mirroring'
Dec 03 21:08:17 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'nfs'
Dec 03 21:08:17 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'orchestrator'
Dec 03 21:08:18 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'osd_perf_query'
Dec 03 21:08:18 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'osd_support'
Dec 03 21:08:18 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'pg_autoscaler'
Dec 03 21:08:18 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'progress'
Dec 03 21:08:18 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'prometheus'
Dec 03 21:08:18 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'rbd_support'
Dec 03 21:08:18 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'rgw'
Dec 03 21:08:19 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'rook'
Dec 03 21:08:19 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'selftest'
Dec 03 21:08:19 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'smb'
Dec 03 21:08:20 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'snap_schedule'
Dec 03 21:08:20 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'stats'
Dec 03 21:08:20 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'status'
Dec 03 21:08:20 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'telegraf'
Dec 03 21:08:20 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'telemetry'
Dec 03 21:08:20 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'test_orchestrator'
Dec 03 21:08:20 compute-0 ceph-mgr[75500]: mgr[py] Loading python module 'volumes'
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Active manager daemon compute-0.jxauqt restarted
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e1 do_prune osdmap full prune enabled
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e1 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Activating manager daemon compute-0.jxauqt
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: ms_deliver_dispatch: unhandled message 0x55f9c7438000 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e1 _set_cache_ratios kv ratio 0.2 inc ratio 0.4 full ratio 0.4
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e1 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 e2: 0 total, 0 up, 0 in
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr handle_mgr_map Activating!
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr handle_mgr_map I am now activating
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e2: 0 total, 0 up, 0 in
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e6: compute-0.jxauqt(active, starting, since 0.0584135s)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata", "id": "compute-0"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "who": "compute-0.jxauqt", "id": "compute-0.jxauqt"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr metadata", "who": "compute-0.jxauqt", "id": "compute-0.jxauqt"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mds metadata"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e1 all = 1
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: balancer
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Starting
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Manager daemon compute-0.jxauqt is now available
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:08:21
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [balancer INFO root] No pools available
Dec 03 21:08:21 compute-0 ceph-mon[75204]: Active manager daemon compute-0.jxauqt restarted
Dec 03 21:08:21 compute-0 ceph-mon[75204]: Activating manager daemon compute-0.jxauqt
Dec 03 21:08:21 compute-0 ceph-mon[75204]: osdmap e2: 0 total, 0 up, 0 in
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mgrmap e6: compute-0.jxauqt(active, starting, since 0.0584135s)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata", "id": "compute-0"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr metadata", "who": "compute-0.jxauqt", "id": "compute-0.jxauqt"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mds metadata"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mon[75204]: Manager daemon compute-0.jxauqt is now available
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/cert_store.cert.cephadm_root_ca_cert}] v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/cert_store.key.cephadm_root_ca_key}] v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.migrations] Found migration_current of "None". Setting to last migration.
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Found migration_current of "None". Setting to last migration.
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/migration_current}] v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1019914911 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/config_checks}] v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: cephadm
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: crash
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: devicehealth
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: iostat
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: nfs
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [devicehealth INFO root] Starting
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: orchestrator
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: pg_autoscaler
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: progress
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [progress INFO root] Loading...
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [progress INFO root] No stored events to load
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [progress INFO root] Loaded [] historic events
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [progress INFO root] Loaded OSDMap, ready.
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] recovery thread starting
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] starting setup
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: rbd_support
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: status
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: telemetry
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/mirror_snapshot_schedule"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/mirror_snapshot_schedule"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] PerfHandler: starting
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TaskHandler: starting
Dec 03 21:08:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/trash_purge_schedule"} v 0)
Dec 03 21:08:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/trash_purge_schedule"} : dispatch
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] setup complete
Dec 03 21:08:21 compute-0 ceph-mgr[75500]: mgr load Constructed class from module: volumes
Dec 03 21:08:22 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e7: compute-0.jxauqt(active, since 1.06777s)
Dec 03 21:08:22 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14126 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch
Dec 03 21:08:22 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14126 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch
Dec 03 21:08:22 compute-0 agitated_solomon[76039]: {
Dec 03 21:08:22 compute-0 agitated_solomon[76039]:     "mgrmap_epoch": 7,
Dec 03 21:08:22 compute-0 agitated_solomon[76039]:     "initialized": true
Dec 03 21:08:22 compute-0 agitated_solomon[76039]: }
Dec 03 21:08:22 compute-0 systemd[1]: libpod-6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799.scope: Deactivated successfully.
Dec 03 21:08:22 compute-0 podman[76023]: 2025-12-03 21:08:22.161874578 +0000 UTC m=+6.983167212 container died 6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799 (image=quay.io/ceph/ceph:v20, name=agitated_solomon, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-cb8808b3f871a4539265f66bc53d7bbb252ede060b9af73bba2e0b3b3851f3e8-merged.mount: Deactivated successfully.
Dec 03 21:08:22 compute-0 podman[76023]: 2025-12-03 21:08:22.220085769 +0000 UTC m=+7.041378413 container remove 6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799 (image=quay.io/ceph/ceph:v20, name=agitated_solomon, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:22 compute-0 systemd[1]: libpod-conmon-6f285a3045bc7e29d550ccf3a12dbf0d925dd7e92be59df31f4277037bc96799.scope: Deactivated successfully.
Dec 03 21:08:22 compute-0 podman[76185]: 2025-12-03 21:08:22.286667088 +0000 UTC m=+0.044408471 container create aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf (image=quay.io/ceph/ceph:v20, name=bold_sutherland, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:08:22 compute-0 systemd[1]: Started libpod-conmon-aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf.scope.
Dec 03 21:08:22 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37dcf0ea3f9e86b0a6c01327bc93b8c8b2b43fd26fd1a8b03c30504f359e6a52/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37dcf0ea3f9e86b0a6c01327bc93b8c8b2b43fd26fd1a8b03c30504f359e6a52/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37dcf0ea3f9e86b0a6c01327bc93b8c8b2b43fd26fd1a8b03c30504f359e6a52/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:22 compute-0 podman[76185]: 2025-12-03 21:08:22.360532566 +0000 UTC m=+0.118273969 container init aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf (image=quay.io/ceph/ceph:v20, name=bold_sutherland, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:08:22 compute-0 podman[76185]: 2025-12-03 21:08:22.266393676 +0000 UTC m=+0.024135109 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:22 compute-0 podman[76185]: 2025-12-03 21:08:22.36593231 +0000 UTC m=+0.123673703 container start aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf (image=quay.io/ceph/ceph:v20, name=bold_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:08:22 compute-0 podman[76185]: 2025-12-03 21:08:22.369841097 +0000 UTC m=+0.127582500 container attach aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf (image=quay.io/ceph/ceph:v20, name=bold_sutherland, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:22 compute-0 ceph-mon[75204]: Found migration_current of "None". Setting to last migration.
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/mirror_snapshot_schedule"} : dispatch
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.jxauqt/trash_purge_schedule"} : dispatch
Dec 03 21:08:22 compute-0 ceph-mon[75204]: mgrmap e7: compute-0.jxauqt(active, since 1.06777s)
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='client.14126 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch
Dec 03 21:08:22 compute-0 ceph-mon[75204]: from='client.14126 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch
Dec 03 21:08:22 compute-0 ceph-mgr[75500]: [cephadm INFO cherrypy.error] [03/Dec/2025:21:08:22] ENGINE Bus STARTING
Dec 03 21:08:22 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : [03/Dec/2025:21:08:22] ENGINE Bus STARTING
Dec 03 21:08:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module enable", "module": "orchestrator"} v 0)
Dec 03 21:08:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2846862938' entity='client.admin' cmd={"prefix": "mgr module enable", "module": "orchestrator"} : dispatch
Dec 03 21:08:22 compute-0 ceph-mgr[75500]: [cephadm INFO cherrypy.error] [03/Dec/2025:21:08:22] ENGINE Serving on http://192.168.122.100:8765
Dec 03 21:08:22 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : [03/Dec/2025:21:08:22] ENGINE Serving on http://192.168.122.100:8765
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: [cephadm INFO cherrypy.error] [03/Dec/2025:21:08:23] ENGINE Serving on https://192.168.122.100:7150
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : [03/Dec/2025:21:08:23] ENGINE Serving on https://192.168.122.100:7150
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: [cephadm INFO cherrypy.error] [03/Dec/2025:21:08:23] ENGINE Bus STARTED
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : [03/Dec/2025:21:08:23] ENGINE Bus STARTED
Dec 03 21:08:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Dec 03 21:08:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: [cephadm INFO cherrypy.error] [03/Dec/2025:21:08:23] ENGINE Client ('192.168.122.100', 49586) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : [03/Dec/2025:21:08:23] ENGINE Client ('192.168.122.100', 49586) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:23 compute-0 ceph-mon[75204]: [03/Dec/2025:21:08:22] ENGINE Bus STARTING
Dec 03 21:08:23 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2846862938' entity='client.admin' cmd={"prefix": "mgr module enable", "module": "orchestrator"} : dispatch
Dec 03 21:08:23 compute-0 ceph-mon[75204]: [03/Dec/2025:21:08:22] ENGINE Serving on http://192.168.122.100:8765
Dec 03 21:08:23 compute-0 ceph-mon[75204]: [03/Dec/2025:21:08:23] ENGINE Serving on https://192.168.122.100:7150
Dec 03 21:08:23 compute-0 ceph-mon[75204]: [03/Dec/2025:21:08:23] ENGINE Bus STARTED
Dec 03 21:08:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:23 compute-0 ceph-mon[75204]: [03/Dec/2025:21:08:23] ENGINE Client ('192.168.122.100', 49586) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 03 21:08:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2846862938' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "orchestrator"}]': finished
Dec 03 21:08:23 compute-0 bold_sutherland[76201]: module 'orchestrator' is already enabled (always-on)
Dec 03 21:08:23 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e8: compute-0.jxauqt(active, since 2s)
Dec 03 21:08:23 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:23 compute-0 systemd[1]: libpod-aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf.scope: Deactivated successfully.
Dec 03 21:08:23 compute-0 podman[76185]: 2025-12-03 21:08:23.745159563 +0000 UTC m=+1.502901006 container died aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf (image=quay.io/ceph/ceph:v20, name=bold_sutherland, CEPH_REF=tentacle, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:08:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-37dcf0ea3f9e86b0a6c01327bc93b8c8b2b43fd26fd1a8b03c30504f359e6a52-merged.mount: Deactivated successfully.
Dec 03 21:08:23 compute-0 podman[76185]: 2025-12-03 21:08:23.792100375 +0000 UTC m=+1.549841768 container remove aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf (image=quay.io/ceph/ceph:v20, name=bold_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:08:23 compute-0 systemd[1]: libpod-conmon-aa239cefdcac36c44ad2eb0bb12cb57c4107976138c81ce71dffc76dcbaccfaf.scope: Deactivated successfully.
Dec 03 21:08:23 compute-0 podman[76262]: 2025-12-03 21:08:23.856414647 +0000 UTC m=+0.047238811 container create bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779 (image=quay.io/ceph/ceph:v20, name=optimistic_chebyshev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:23 compute-0 systemd[1]: Started libpod-conmon-bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779.scope.
Dec 03 21:08:23 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17882cb9eef892dc18564498f212051334045edeee853f24a3caf07db2f3f814/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17882cb9eef892dc18564498f212051334045edeee853f24a3caf07db2f3f814/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17882cb9eef892dc18564498f212051334045edeee853f24a3caf07db2f3f814/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:23 compute-0 podman[76262]: 2025-12-03 21:08:23.832652349 +0000 UTC m=+0.023476593 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:23 compute-0 podman[76262]: 2025-12-03 21:08:23.932584363 +0000 UTC m=+0.123408547 container init bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779 (image=quay.io/ceph/ceph:v20, name=optimistic_chebyshev, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:08:23 compute-0 podman[76262]: 2025-12-03 21:08:23.950704611 +0000 UTC m=+0.141528775 container start bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779 (image=quay.io/ceph/ceph:v20, name=optimistic_chebyshev, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:23 compute-0 podman[76262]: 2025-12-03 21:08:23.955044049 +0000 UTC m=+0.145868233 container attach bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779 (image=quay.io/ceph/ceph:v20, name=optimistic_chebyshev, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:24 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14136 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/orchestrator/orchestrator}] v 0)
Dec 03 21:08:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Dec 03 21:08:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:24 compute-0 systemd[1]: libpod-bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779.scope: Deactivated successfully.
Dec 03 21:08:24 compute-0 podman[76262]: 2025-12-03 21:08:24.440983109 +0000 UTC m=+0.631807293 container died bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779 (image=quay.io/ceph/ceph:v20, name=optimistic_chebyshev, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:08:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-17882cb9eef892dc18564498f212051334045edeee853f24a3caf07db2f3f814-merged.mount: Deactivated successfully.
Dec 03 21:08:24 compute-0 podman[76262]: 2025-12-03 21:08:24.502892521 +0000 UTC m=+0.693716695 container remove bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779 (image=quay.io/ceph/ceph:v20, name=optimistic_chebyshev, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:24 compute-0 systemd[1]: libpod-conmon-bad730a94e0e312c7595b53e179b264cf6735213a5a13ba41e4e0c74fb2af779.scope: Deactivated successfully.
Dec 03 21:08:24 compute-0 podman[76317]: 2025-12-03 21:08:24.556922208 +0000 UTC m=+0.037386326 container create 20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df (image=quay.io/ceph/ceph:v20, name=pensive_antonelli, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True)
Dec 03 21:08:24 compute-0 systemd[1]: Started libpod-conmon-20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df.scope.
Dec 03 21:08:24 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/821a196636743973932c9101c237b7efee32866e7fb11369e83d932fa291dd43/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/821a196636743973932c9101c237b7efee32866e7fb11369e83d932fa291dd43/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/821a196636743973932c9101c237b7efee32866e7fb11369e83d932fa291dd43/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:24 compute-0 podman[76317]: 2025-12-03 21:08:24.634720524 +0000 UTC m=+0.115184712 container init 20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df (image=quay.io/ceph/ceph:v20, name=pensive_antonelli, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:24 compute-0 podman[76317]: 2025-12-03 21:08:24.539244421 +0000 UTC m=+0.019708559 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:24 compute-0 podman[76317]: 2025-12-03 21:08:24.641361499 +0000 UTC m=+0.121825637 container start 20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df (image=quay.io/ceph/ceph:v20, name=pensive_antonelli, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:08:24 compute-0 podman[76317]: 2025-12-03 21:08:24.644914347 +0000 UTC m=+0.125378545 container attach 20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df (image=quay.io/ceph/ceph:v20, name=pensive_antonelli, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:08:24 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2846862938' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "orchestrator"}]': finished
Dec 03 21:08:24 compute-0 ceph-mon[75204]: mgrmap e8: compute-0.jxauqt(active, since 2s)
Dec 03 21:08:24 compute-0 ceph-mon[75204]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "ceph-admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_user}] v 0)
Dec 03 21:08:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: [cephadm INFO root] Set ssh ssh_user
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Set ssh ssh_user
Dec 03 21:08:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_config}] v 0)
Dec 03 21:08:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: [cephadm INFO root] Set ssh ssh_config
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Set ssh ssh_config
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: [cephadm INFO root] ssh user set to ceph-admin. sudo will be used
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : ssh user set to ceph-admin. sudo will be used
Dec 03 21:08:25 compute-0 pensive_antonelli[76334]: ssh user set to ceph-admin. sudo will be used
Dec 03 21:08:25 compute-0 systemd[1]: libpod-20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df.scope: Deactivated successfully.
Dec 03 21:08:25 compute-0 podman[76317]: 2025-12-03 21:08:25.072169353 +0000 UTC m=+0.552633491 container died 20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df (image=quay.io/ceph/ceph:v20, name=pensive_antonelli, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:08:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-821a196636743973932c9101c237b7efee32866e7fb11369e83d932fa291dd43-merged.mount: Deactivated successfully.
Dec 03 21:08:25 compute-0 podman[76317]: 2025-12-03 21:08:25.125473913 +0000 UTC m=+0.605938061 container remove 20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df (image=quay.io/ceph/ceph:v20, name=pensive_antonelli, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:25 compute-0 systemd[1]: libpod-conmon-20cc8fd3273905d87093de465b056bcd5e80f4ac687f9085365d5b5c8c8377df.scope: Deactivated successfully.
Dec 03 21:08:25 compute-0 podman[76374]: 2025-12-03 21:08:25.19765757 +0000 UTC m=+0.055224748 container create d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f (image=quay.io/ceph/ceph:v20, name=funny_darwin, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:25 compute-0 systemd[1]: Started libpod-conmon-d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f.scope.
Dec 03 21:08:25 compute-0 podman[76374]: 2025-12-03 21:08:25.168061598 +0000 UTC m=+0.025628846 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:25 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28f1fa760d7c2cb668456c0ce4d737f4a614b9629b7dcc31063dceea44d190ac/merged/tmp/cephadm-ssh-key supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28f1fa760d7c2cb668456c0ce4d737f4a614b9629b7dcc31063dceea44d190ac/merged/tmp/cephadm-ssh-key.pub supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28f1fa760d7c2cb668456c0ce4d737f4a614b9629b7dcc31063dceea44d190ac/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28f1fa760d7c2cb668456c0ce4d737f4a614b9629b7dcc31063dceea44d190ac/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/28f1fa760d7c2cb668456c0ce4d737f4a614b9629b7dcc31063dceea44d190ac/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 podman[76374]: 2025-12-03 21:08:25.298660191 +0000 UTC m=+0.156227359 container init d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f (image=quay.io/ceph/ceph:v20, name=funny_darwin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:08:25 compute-0 podman[76374]: 2025-12-03 21:08:25.310780421 +0000 UTC m=+0.168347599 container start d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f (image=quay.io/ceph/ceph:v20, name=funny_darwin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:25 compute-0 podman[76374]: 2025-12-03 21:08:25.314874002 +0000 UTC m=+0.172441180 container attach d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f (image=quay.io/ceph/ceph:v20, name=funny_darwin, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14140 -' entity='client.admin' cmd=[{"prefix": "cephadm set-priv-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_identity_key}] v 0)
Dec 03 21:08:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: [cephadm INFO root] Set ssh ssh_identity_key
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Set ssh ssh_identity_key
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: [cephadm INFO root] Set ssh private key
Dec 03 21:08:25 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Set ssh private key
Dec 03 21:08:25 compute-0 systemd[1]: libpod-d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f.scope: Deactivated successfully.
Dec 03 21:08:25 compute-0 podman[76374]: 2025-12-03 21:08:25.761080868 +0000 UTC m=+0.618648046 container died d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f (image=quay.io/ceph/ceph:v20, name=funny_darwin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-28f1fa760d7c2cb668456c0ce4d737f4a614b9629b7dcc31063dceea44d190ac-merged.mount: Deactivated successfully.
Dec 03 21:08:25 compute-0 podman[76374]: 2025-12-03 21:08:25.798475754 +0000 UTC m=+0.656042902 container remove d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f (image=quay.io/ceph/ceph:v20, name=funny_darwin, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:25 compute-0 systemd[1]: libpod-conmon-d1d3ca8cb9029d698fb4afd45382ff8590ce15333c8212c069b2c8b30f0c413f.scope: Deactivated successfully.
Dec 03 21:08:25 compute-0 podman[76428]: 2025-12-03 21:08:25.874677871 +0000 UTC m=+0.057330821 container create 3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827 (image=quay.io/ceph/ceph:v20, name=gallant_bhabha, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:25 compute-0 systemd[1]: Started libpod-conmon-3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827.scope.
Dec 03 21:08:25 compute-0 podman[76428]: 2025-12-03 21:08:25.842801851 +0000 UTC m=+0.025454861 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:25 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6071a846085fa60d580b71e5dcb1cedd246ac315379a174eea184c5710763a72/merged/tmp/cephadm-ssh-key supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6071a846085fa60d580b71e5dcb1cedd246ac315379a174eea184c5710763a72/merged/tmp/cephadm-ssh-key.pub supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6071a846085fa60d580b71e5dcb1cedd246ac315379a174eea184c5710763a72/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6071a846085fa60d580b71e5dcb1cedd246ac315379a174eea184c5710763a72/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6071a846085fa60d580b71e5dcb1cedd246ac315379a174eea184c5710763a72/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:25 compute-0 podman[76428]: 2025-12-03 21:08:25.971180989 +0000 UTC m=+0.153833969 container init 3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827 (image=quay.io/ceph/ceph:v20, name=gallant_bhabha, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:08:25 compute-0 podman[76428]: 2025-12-03 21:08:25.989832121 +0000 UTC m=+0.172485041 container start 3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827 (image=quay.io/ceph/ceph:v20, name=gallant_bhabha, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:25 compute-0 podman[76428]: 2025-12-03 21:08:25.993801389 +0000 UTC m=+0.176454349 container attach 3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827 (image=quay.io/ceph/ceph:v20, name=gallant_bhabha, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:08:26 compute-0 ceph-mon[75204]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "ceph-admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:26 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:26 compute-0 ceph-mon[75204]: Set ssh ssh_user
Dec 03 21:08:26 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:26 compute-0 ceph-mon[75204]: Set ssh ssh_config
Dec 03 21:08:26 compute-0 ceph-mon[75204]: ssh user set to ceph-admin. sudo will be used
Dec 03 21:08:26 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:26 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14142 -' entity='client.admin' cmd=[{"prefix": "cephadm set-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/ssh_identity_pub}] v 0)
Dec 03 21:08:26 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:26 compute-0 ceph-mgr[75500]: [cephadm INFO root] Set ssh ssh_identity_pub
Dec 03 21:08:26 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Set ssh ssh_identity_pub
Dec 03 21:08:26 compute-0 systemd[1]: libpod-3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827.scope: Deactivated successfully.
Dec 03 21:08:26 compute-0 podman[76428]: 2025-12-03 21:08:26.405780998 +0000 UTC m=+0.588433928 container died 3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827 (image=quay.io/ceph/ceph:v20, name=gallant_bhabha, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS)
Dec 03 21:08:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-6071a846085fa60d580b71e5dcb1cedd246ac315379a174eea184c5710763a72-merged.mount: Deactivated successfully.
Dec 03 21:08:26 compute-0 podman[76428]: 2025-12-03 21:08:26.443741738 +0000 UTC m=+0.626394658 container remove 3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827 (image=quay.io/ceph/ceph:v20, name=gallant_bhabha, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:08:26 compute-0 systemd[1]: libpod-conmon-3064e6256545be95e5d1be40fb3b18af89a2666f9272a678eb6b3a81ae2f4827.scope: Deactivated successfully.
Dec 03 21:08:26 compute-0 podman[76482]: 2025-12-03 21:08:26.533189372 +0000 UTC m=+0.062939049 container create 74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61 (image=quay.io/ceph/ceph:v20, name=wizardly_elbakyan, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:08:26 compute-0 systemd[1]: Started libpod-conmon-74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61.scope.
Dec 03 21:08:26 compute-0 podman[76482]: 2025-12-03 21:08:26.506835129 +0000 UTC m=+0.036584876 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:26 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/700f99c3eb0ccc68daf51ef90e2b44453ac89cf0a2db6b7da8303fa65140866f/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/700f99c3eb0ccc68daf51ef90e2b44453ac89cf0a2db6b7da8303fa65140866f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/700f99c3eb0ccc68daf51ef90e2b44453ac89cf0a2db6b7da8303fa65140866f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:26 compute-0 podman[76482]: 2025-12-03 21:08:26.631203139 +0000 UTC m=+0.160952866 container init 74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61 (image=quay.io/ceph/ceph:v20, name=wizardly_elbakyan, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True)
Dec 03 21:08:26 compute-0 podman[76482]: 2025-12-03 21:08:26.64055385 +0000 UTC m=+0.170303557 container start 74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61 (image=quay.io/ceph/ceph:v20, name=wizardly_elbakyan, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:26 compute-0 podman[76482]: 2025-12-03 21:08:26.644239962 +0000 UTC m=+0.173989659 container attach 74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61 (image=quay.io/ceph/ceph:v20, name=wizardly_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020052782 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:27 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14144 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:27 compute-0 wizardly_elbakyan[76500]: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmXlK5y3Q4gaxv6M50V21Q5BxfeScA1efDlUPFJQx+7fjDW9LbjybJnwuUiZddHS7AZQRDwmbSKKxujrM5O/RgOcUubf//z1FleN0ZLzMN2Kr2gR59aLCX7I6nP+kfOWwDLCmPnnlx2ep27ttsyPvFt+E6LeBlVRn9DnMTfpoiiTjepXC8sLt6ogfOug/YhSPG6VZt2HY+eLupSup1SvQ+fP/YIzuAXPUwfRP9rehCj247OHEahfKtxzK+7b222mYPUvFVhbsVfq9ZMr2iPqRln/w4MSWo3GMwwEWZB5cNsS0qJYM9Hr5wrTQN22SOhd/BssM3SzThpbExCkObrut812OwlOJn80SkhESE0NxdQWW1tJOXVufFebcyMMrqpG9eEQvBVnVM/jkEw5epe8tMa5K8J1RPZ9xezSTHCxcgMv8ma+AxmKOAh8Dl6sBHWchaRvllX9UgXJFPcgPD2C8sEGcsJyEEyaToEr5gqYVkE8O0HpUZvOi5w8AlXNxUt0k= zuul@controller
Dec 03 21:08:27 compute-0 systemd[1]: libpod-74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61.scope: Deactivated successfully.
Dec 03 21:08:27 compute-0 podman[76482]: 2025-12-03 21:08:27.103152522 +0000 UTC m=+0.632902179 container died 74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61 (image=quay.io/ceph/ceph:v20, name=wizardly_elbakyan, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-700f99c3eb0ccc68daf51ef90e2b44453ac89cf0a2db6b7da8303fa65140866f-merged.mount: Deactivated successfully.
Dec 03 21:08:27 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:27 compute-0 podman[76482]: 2025-12-03 21:08:27.13985689 +0000 UTC m=+0.669606557 container remove 74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61 (image=quay.io/ceph/ceph:v20, name=wizardly_elbakyan, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:08:27 compute-0 systemd[1]: libpod-conmon-74293bb479b2dff6581cb68b03a3b89244feead0649162afe32fd606672f9c61.scope: Deactivated successfully.
Dec 03 21:08:27 compute-0 podman[76537]: 2025-12-03 21:08:27.210948021 +0000 UTC m=+0.048957684 container create 8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18 (image=quay.io/ceph/ceph:v20, name=vigorous_buck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 03 21:08:27 compute-0 systemd[1]: Started libpod-conmon-8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18.scope.
Dec 03 21:08:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:27 compute-0 podman[76537]: 2025-12-03 21:08:27.186038094 +0000 UTC m=+0.024047857 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fa870aef962dfd4c92a92bc1b3bb8c16f2ab8d9afe63b354477e89eeef141f1/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fa870aef962dfd4c92a92bc1b3bb8c16f2ab8d9afe63b354477e89eeef141f1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fa870aef962dfd4c92a92bc1b3bb8c16f2ab8d9afe63b354477e89eeef141f1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:27 compute-0 podman[76537]: 2025-12-03 21:08:27.293141495 +0000 UTC m=+0.131151168 container init 8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18 (image=quay.io/ceph/ceph:v20, name=vigorous_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:27 compute-0 podman[76537]: 2025-12-03 21:08:27.30184039 +0000 UTC m=+0.139850063 container start 8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18 (image=quay.io/ceph/ceph:v20, name=vigorous_buck, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:27 compute-0 podman[76537]: 2025-12-03 21:08:27.307443079 +0000 UTC m=+0.145452782 container attach 8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18 (image=quay.io/ceph/ceph:v20, name=vigorous_buck, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:08:27 compute-0 ceph-mon[75204]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "cephadm set-priv-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:27 compute-0 ceph-mon[75204]: Set ssh ssh_identity_key
Dec 03 21:08:27 compute-0 ceph-mon[75204]: Set ssh private key
Dec 03 21:08:27 compute-0 ceph-mon[75204]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "cephadm set-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:27 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:27 compute-0 ceph-mon[75204]: Set ssh ssh_identity_pub
Dec 03 21:08:27 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "compute-0", "addr": "192.168.122.100", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:27 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:27 compute-0 sshd-session[76580]: Accepted publickey for ceph-admin from 192.168.122.100 port 57312 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:27 compute-0 systemd-logind[787]: New session 20 of user ceph-admin.
Dec 03 21:08:28 compute-0 systemd[1]: Created slice User Slice of UID 42477.
Dec 03 21:08:28 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 03 21:08:28 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 03 21:08:28 compute-0 systemd[1]: Starting User Manager for UID 42477...
Dec 03 21:08:28 compute-0 systemd[76584]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:28 compute-0 sshd-session[76597]: Accepted publickey for ceph-admin from 192.168.122.100 port 57324 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:28 compute-0 systemd[76584]: Queued start job for default target Main User Target.
Dec 03 21:08:28 compute-0 systemd-logind[787]: New session 22 of user ceph-admin.
Dec 03 21:08:28 compute-0 systemd[76584]: Created slice User Application Slice.
Dec 03 21:08:28 compute-0 systemd[76584]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 03 21:08:28 compute-0 systemd[76584]: Started Daily Cleanup of User's Temporary Directories.
Dec 03 21:08:28 compute-0 systemd[76584]: Reached target Paths.
Dec 03 21:08:28 compute-0 systemd[76584]: Reached target Timers.
Dec 03 21:08:28 compute-0 systemd[76584]: Starting D-Bus User Message Bus Socket...
Dec 03 21:08:28 compute-0 systemd[76584]: Starting Create User's Volatile Files and Directories...
Dec 03 21:08:28 compute-0 systemd[76584]: Finished Create User's Volatile Files and Directories.
Dec 03 21:08:28 compute-0 systemd[76584]: Listening on D-Bus User Message Bus Socket.
Dec 03 21:08:28 compute-0 systemd[76584]: Reached target Sockets.
Dec 03 21:08:28 compute-0 systemd[76584]: Reached target Basic System.
Dec 03 21:08:28 compute-0 systemd[76584]: Reached target Main User Target.
Dec 03 21:08:28 compute-0 systemd[76584]: Startup finished in 168ms.
Dec 03 21:08:28 compute-0 systemd[1]: Started User Manager for UID 42477.
Dec 03 21:08:28 compute-0 systemd[1]: Started Session 20 of User ceph-admin.
Dec 03 21:08:28 compute-0 systemd[1]: Started Session 22 of User ceph-admin.
Dec 03 21:08:28 compute-0 sshd-session[76580]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:28 compute-0 sshd-session[76597]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:28 compute-0 ceph-mon[75204]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:28 compute-0 sudo[76604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:28 compute-0 sudo[76604]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:28 compute-0 sudo[76604]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:28 compute-0 sshd-session[76629]: Accepted publickey for ceph-admin from 192.168.122.100 port 57326 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:28 compute-0 systemd-logind[787]: New session 23 of user ceph-admin.
Dec 03 21:08:28 compute-0 systemd[1]: Started Session 23 of User ceph-admin.
Dec 03 21:08:28 compute-0 sshd-session[76629]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:28 compute-0 sudo[76633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 check-host --expect-hostname compute-0
Dec 03 21:08:28 compute-0 sudo[76633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:28 compute-0 sudo[76633]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:28 compute-0 sshd-session[76658]: Accepted publickey for ceph-admin from 192.168.122.100 port 57342 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:28 compute-0 systemd-logind[787]: New session 24 of user ceph-admin.
Dec 03 21:08:28 compute-0 systemd[1]: Started Session 24 of User ceph-admin.
Dec 03 21:08:28 compute-0 sshd-session[76658]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:29 compute-0 sudo[76662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b
Dec 03 21:08:29 compute-0 sudo[76662]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:29 compute-0 sudo[76662]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:29 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Deploying cephadm binary to compute-0
Dec 03 21:08:29 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Deploying cephadm binary to compute-0
Dec 03 21:08:29 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:29 compute-0 sshd-session[76687]: Accepted publickey for ceph-admin from 192.168.122.100 port 57352 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:29 compute-0 systemd-logind[787]: New session 25 of user ceph-admin.
Dec 03 21:08:29 compute-0 systemd[1]: Started Session 25 of User ceph-admin.
Dec 03 21:08:29 compute-0 sshd-session[76687]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:29 compute-0 ceph-mon[75204]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "compute-0", "addr": "192.168.122.100", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:29 compute-0 sudo[76691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:29 compute-0 sudo[76691]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:29 compute-0 sudo[76691]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:29 compute-0 sshd-session[76716]: Accepted publickey for ceph-admin from 192.168.122.100 port 57362 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:29 compute-0 systemd-logind[787]: New session 26 of user ceph-admin.
Dec 03 21:08:29 compute-0 systemd[1]: Started Session 26 of User ceph-admin.
Dec 03 21:08:29 compute-0 sshd-session[76716]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:29 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:29 compute-0 sudo[76720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:29 compute-0 sudo[76720]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:29 compute-0 sudo[76720]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:30 compute-0 sshd-session[76745]: Accepted publickey for ceph-admin from 192.168.122.100 port 57374 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:30 compute-0 systemd-logind[787]: New session 27 of user ceph-admin.
Dec 03 21:08:30 compute-0 systemd[1]: Started Session 27 of User ceph-admin.
Dec 03 21:08:30 compute-0 sshd-session[76745]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:30 compute-0 sudo[76749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b.new
Dec 03 21:08:30 compute-0 sudo[76749]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:30 compute-0 sudo[76749]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:30 compute-0 ceph-mon[75204]: Deploying cephadm binary to compute-0
Dec 03 21:08:30 compute-0 sshd-session[76774]: Accepted publickey for ceph-admin from 192.168.122.100 port 57376 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:30 compute-0 systemd-logind[787]: New session 28 of user ceph-admin.
Dec 03 21:08:30 compute-0 systemd[1]: Started Session 28 of User ceph-admin.
Dec 03 21:08:30 compute-0 sshd-session[76774]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:30 compute-0 sudo[76778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:30 compute-0 sudo[76778]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:30 compute-0 sudo[76778]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:30 compute-0 sshd-session[76803]: Accepted publickey for ceph-admin from 192.168.122.100 port 57384 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:30 compute-0 systemd-logind[787]: New session 29 of user ceph-admin.
Dec 03 21:08:30 compute-0 systemd[1]: Started Session 29 of User ceph-admin.
Dec 03 21:08:30 compute-0 sshd-session[76803]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:30 compute-0 sudo[76807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b.new
Dec 03 21:08:30 compute-0 sudo[76807]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:30 compute-0 sudo[76807]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:31 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:31 compute-0 sshd-session[76832]: Accepted publickey for ceph-admin from 192.168.122.100 port 57396 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:31 compute-0 systemd-logind[787]: New session 30 of user ceph-admin.
Dec 03 21:08:31 compute-0 systemd[1]: Started Session 30 of User ceph-admin.
Dec 03 21:08:31 compute-0 sshd-session[76832]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054704 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:31 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:32 compute-0 sshd-session[76859]: Accepted publickey for ceph-admin from 192.168.122.100 port 57406 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:32 compute-0 systemd-logind[787]: New session 31 of user ceph-admin.
Dec 03 21:08:32 compute-0 systemd[1]: Started Session 31 of User ceph-admin.
Dec 03 21:08:32 compute-0 sshd-session[76859]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:32 compute-0 sudo[76863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv -Z /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b.new /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b
Dec 03 21:08:32 compute-0 sudo[76863]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:32 compute-0 sudo[76863]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:33 compute-0 sshd-session[76888]: Accepted publickey for ceph-admin from 192.168.122.100 port 57418 ssh2: RSA SHA256:kX7AX5dlReVHjwzadwRPI8+RUfTmOfALHhsJz8FVMOw
Dec 03 21:08:33 compute-0 systemd-logind[787]: New session 32 of user ceph-admin.
Dec 03 21:08:33 compute-0 systemd[1]: Started Session 32 of User ceph-admin.
Dec 03 21:08:33 compute-0 sshd-session[76888]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 03 21:08:33 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:33 compute-0 sudo[76892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 check-host --expect-hostname compute-0
Dec 03 21:08:33 compute-0 sudo[76892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:33 compute-0 sudo[76892]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Dec 03 21:08:33 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:33 compute-0 ceph-mgr[75500]: [cephadm INFO root] Added host compute-0
Dec 03 21:08:33 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Added host compute-0
Dec 03 21:08:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Dec 03 21:08:33 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:33 compute-0 vigorous_buck[76554]: Added host 'compute-0' with addr '192.168.122.100'
Dec 03 21:08:33 compute-0 systemd[1]: libpod-8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18.scope: Deactivated successfully.
Dec 03 21:08:33 compute-0 podman[76537]: 2025-12-03 21:08:33.608122145 +0000 UTC m=+6.446131888 container died 8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18 (image=quay.io/ceph/ceph:v20, name=vigorous_buck, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:08:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-5fa870aef962dfd4c92a92bc1b3bb8c16f2ab8d9afe63b354477e89eeef141f1-merged.mount: Deactivated successfully.
Dec 03 21:08:33 compute-0 sudo[76938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:33 compute-0 podman[76537]: 2025-12-03 21:08:33.672426917 +0000 UTC m=+6.510436600 container remove 8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18 (image=quay.io/ceph/ceph:v20, name=vigorous_buck, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:33 compute-0 sudo[76938]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:33 compute-0 sudo[76938]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:33 compute-0 systemd[1]: libpod-conmon-8a65f8d31b686b4902fdfda5464e97e7eef2b3bc863de964c4901dca9115be18.scope: Deactivated successfully.
Dec 03 21:08:33 compute-0 podman[76977]: 2025-12-03 21:08:33.749395023 +0000 UTC m=+0.052448619 container create 2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04 (image=quay.io/ceph/ceph:v20, name=great_antonelli, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:08:33 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:33 compute-0 sudo[76978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph:v20 --timeout 895 pull
Dec 03 21:08:33 compute-0 sudo[76978]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:33 compute-0 systemd[1]: Started libpod-conmon-2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04.scope.
Dec 03 21:08:33 compute-0 podman[76977]: 2025-12-03 21:08:33.731461689 +0000 UTC m=+0.034515315 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1f7741a6dda40c1f59190b99dc3f6fa85b43067a9795a6511e0e8def865249f/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1f7741a6dda40c1f59190b99dc3f6fa85b43067a9795a6511e0e8def865249f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1f7741a6dda40c1f59190b99dc3f6fa85b43067a9795a6511e0e8def865249f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:33 compute-0 podman[76977]: 2025-12-03 21:08:33.855253464 +0000 UTC m=+0.158307090 container init 2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04 (image=quay.io/ceph/ceph:v20, name=great_antonelli, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:33 compute-0 podman[76977]: 2025-12-03 21:08:33.867894846 +0000 UTC m=+0.170948472 container start 2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04 (image=quay.io/ceph/ceph:v20, name=great_antonelli, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:33 compute-0 podman[76977]: 2025-12-03 21:08:33.87244731 +0000 UTC m=+0.175501016 container attach 2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04 (image=quay.io/ceph/ceph:v20, name=great_antonelli, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030)
Dec 03 21:08:34 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:34 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service mon spec with placement count:5
Dec 03 21:08:34 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service mon spec with placement count:5
Dec 03 21:08:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 03 21:08:34 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:34 compute-0 great_antonelli[77018]: Scheduled mon update...
Dec 03 21:08:34 compute-0 systemd[1]: libpod-2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04.scope: Deactivated successfully.
Dec 03 21:08:34 compute-0 podman[77069]: 2025-12-03 21:08:34.392612246 +0000 UTC m=+0.023008701 container died 2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04 (image=quay.io/ceph/ceph:v20, name=great_antonelli, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-e1f7741a6dda40c1f59190b99dc3f6fa85b43067a9795a6511e0e8def865249f-merged.mount: Deactivated successfully.
Dec 03 21:08:34 compute-0 podman[77069]: 2025-12-03 21:08:34.427058759 +0000 UTC m=+0.057455194 container remove 2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04 (image=quay.io/ceph/ceph:v20, name=great_antonelli, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:08:34 compute-0 systemd[1]: libpod-conmon-2e83b089b30b51a5b6596a2ee6c239c0c9e2e75ccd9ff2d4822a36750419af04.scope: Deactivated successfully.
Dec 03 21:08:34 compute-0 podman[77084]: 2025-12-03 21:08:34.511583072 +0000 UTC m=+0.055312691 container create e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473 (image=quay.io/ceph/ceph:v20, name=wizardly_yalow, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 03 21:08:34 compute-0 systemd[1]: Started libpod-conmon-e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473.scope.
Dec 03 21:08:34 compute-0 podman[77053]: 2025-12-03 21:08:34.56041186 +0000 UTC m=+0.520572748 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:34 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:34 compute-0 ceph-mon[75204]: Added host compute-0
Dec 03 21:08:34 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:08:34 compute-0 ceph-mon[75204]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:34 compute-0 ceph-mon[75204]: Saving service mon spec with placement count:5
Dec 03 21:08:34 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:34 compute-0 podman[77084]: 2025-12-03 21:08:34.485608859 +0000 UTC m=+0.029338588 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:34 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65387f9127fbc680e5ce1e823b0c634bbd8781b9a38a6f5d6d5c0c47a30512ec/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65387f9127fbc680e5ce1e823b0c634bbd8781b9a38a6f5d6d5c0c47a30512ec/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65387f9127fbc680e5ce1e823b0c634bbd8781b9a38a6f5d6d5c0c47a30512ec/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:34 compute-0 podman[77084]: 2025-12-03 21:08:34.607373133 +0000 UTC m=+0.151102842 container init e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473 (image=quay.io/ceph/ceph:v20, name=wizardly_yalow, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:08:34 compute-0 podman[77084]: 2025-12-03 21:08:34.617779811 +0000 UTC m=+0.161509430 container start e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473 (image=quay.io/ceph/ceph:v20, name=wizardly_yalow, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:34 compute-0 podman[77084]: 2025-12-03 21:08:34.62221651 +0000 UTC m=+0.165946169 container attach e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473 (image=quay.io/ceph/ceph:v20, name=wizardly_yalow, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:08:34 compute-0 podman[77116]: 2025-12-03 21:08:34.697872773 +0000 UTC m=+0.046721857 container create 0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b (image=quay.io/ceph/ceph:v20, name=strange_lovelace, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0)
Dec 03 21:08:34 compute-0 systemd[1]: Started libpod-conmon-0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b.scope.
Dec 03 21:08:34 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:34 compute-0 podman[77116]: 2025-12-03 21:08:34.679503489 +0000 UTC m=+0.028352593 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:34 compute-0 podman[77116]: 2025-12-03 21:08:34.782446557 +0000 UTC m=+0.131295681 container init 0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b (image=quay.io/ceph/ceph:v20, name=strange_lovelace, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:08:34 compute-0 podman[77116]: 2025-12-03 21:08:34.787163573 +0000 UTC m=+0.136012657 container start 0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b (image=quay.io/ceph/ceph:v20, name=strange_lovelace, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:08:34 compute-0 podman[77116]: 2025-12-03 21:08:34.791018109 +0000 UTC m=+0.139867243 container attach 0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b (image=quay.io/ceph/ceph:v20, name=strange_lovelace, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:08:34 compute-0 strange_lovelace[77140]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable)
Dec 03 21:08:34 compute-0 systemd[1]: libpod-0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b.scope: Deactivated successfully.
Dec 03 21:08:34 compute-0 podman[77116]: 2025-12-03 21:08:34.877350836 +0000 UTC m=+0.226199950 container died 0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b (image=quay.io/ceph/ceph:v20, name=strange_lovelace, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0)
Dec 03 21:08:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-0d403c9de6e192ef079128a729c2ae483103031555dcfbecb9d8ece07938e6e3-merged.mount: Deactivated successfully.
Dec 03 21:08:34 compute-0 podman[77116]: 2025-12-03 21:08:34.939796082 +0000 UTC m=+0.288645186 container remove 0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b (image=quay.io/ceph/ceph:v20, name=strange_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:08:34 compute-0 systemd[1]: libpod-conmon-0deee56b314367d338cf0346100e9219a0a47b3e8930a9340ebf59147cb7aa1b.scope: Deactivated successfully.
Dec 03 21:08:34 compute-0 sudo[76978]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=container_image}] v 0)
Dec 03 21:08:34 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:35 compute-0 sudo[77169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:35 compute-0 sudo[77169]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:35 compute-0 sudo[77169]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service mgr spec with placement count:2
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service mgr spec with placement count:2
Dec 03 21:08:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Dec 03 21:08:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:35 compute-0 wizardly_yalow[77100]: Scheduled mgr update...
Dec 03 21:08:35 compute-0 sudo[77194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 check-host
Dec 03 21:08:35 compute-0 sudo[77194]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:35 compute-0 systemd[1]: libpod-e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473.scope: Deactivated successfully.
Dec 03 21:08:35 compute-0 podman[77084]: 2025-12-03 21:08:35.124018083 +0000 UTC m=+0.667747722 container died e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473 (image=quay.io/ceph/ceph:v20, name=wizardly_yalow, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-65387f9127fbc680e5ce1e823b0c634bbd8781b9a38a6f5d6d5c0c47a30512ec-merged.mount: Deactivated successfully.
Dec 03 21:08:35 compute-0 podman[77084]: 2025-12-03 21:08:35.163851379 +0000 UTC m=+0.707581008 container remove e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473 (image=quay.io/ceph/ceph:v20, name=wizardly_yalow, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:35 compute-0 systemd[1]: libpod-conmon-e7d408957c2bd91ba9870cb546b3a163074fd8e41e656bbbdb6603eb7f0c5473.scope: Deactivated successfully.
Dec 03 21:08:35 compute-0 podman[77232]: 2025-12-03 21:08:35.238526517 +0000 UTC m=+0.055602397 container create 435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4 (image=quay.io/ceph/ceph:v20, name=dreamy_goodall, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:08:35 compute-0 systemd[1]: Started libpod-conmon-435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4.scope.
Dec 03 21:08:35 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:35 compute-0 podman[77232]: 2025-12-03 21:08:35.208117275 +0000 UTC m=+0.025193195 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/595a9f842ede8d6fa33fe7138cbad391cacc8d34f8163c6a3b6395a43e70ee6d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/595a9f842ede8d6fa33fe7138cbad391cacc8d34f8163c6a3b6395a43e70ee6d/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/595a9f842ede8d6fa33fe7138cbad391cacc8d34f8163c6a3b6395a43e70ee6d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:35 compute-0 podman[77232]: 2025-12-03 21:08:35.318866846 +0000 UTC m=+0.135942756 container init 435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4 (image=quay.io/ceph/ceph:v20, name=dreamy_goodall, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:35 compute-0 podman[77232]: 2025-12-03 21:08:35.324742181 +0000 UTC m=+0.141818021 container start 435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4 (image=quay.io/ceph/ceph:v20, name=dreamy_goodall, CEPH_REF=tentacle, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:35 compute-0 podman[77232]: 2025-12-03 21:08:35.328403652 +0000 UTC m=+0.145479492 container attach 435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4 (image=quay.io/ceph/ceph:v20, name=dreamy_goodall, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:35 compute-0 sudo[77194]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:35 compute-0 sudo[77293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:35 compute-0 sudo[77293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:35 compute-0 sudo[77293]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:35 compute-0 sudo[77318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:08:35 compute-0 sudo[77318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service crash spec with placement *
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service crash spec with placement *
Dec 03 21:08:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0)
Dec 03 21:08:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:35 compute-0 dreamy_goodall[77248]: Scheduled crash update...
Dec 03 21:08:35 compute-0 systemd[1]: libpod-435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4.scope: Deactivated successfully.
Dec 03 21:08:35 compute-0 podman[77232]: 2025-12-03 21:08:35.739479298 +0000 UTC m=+0.556555138 container died 435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4 (image=quay.io/ceph/ceph:v20, name=dreamy_goodall, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:35 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-595a9f842ede8d6fa33fe7138cbad391cacc8d34f8163c6a3b6395a43e70ee6d-merged.mount: Deactivated successfully.
Dec 03 21:08:35 compute-0 podman[77232]: 2025-12-03 21:08:35.773033989 +0000 UTC m=+0.590109829 container remove 435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4 (image=quay.io/ceph/ceph:v20, name=dreamy_goodall, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:35 compute-0 systemd[1]: libpod-conmon-435b493dd842b04b302bc1288a801cc39422701487c6ed24bd711145141201c4.scope: Deactivated successfully.
Dec 03 21:08:35 compute-0 podman[77357]: 2025-12-03 21:08:35.849335418 +0000 UTC m=+0.051661800 container create e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae (image=quay.io/ceph/ceph:v20, name=vibrant_greider, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:35 compute-0 systemd[1]: Started libpod-conmon-e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae.scope.
Dec 03 21:08:35 compute-0 podman[77357]: 2025-12-03 21:08:35.821984971 +0000 UTC m=+0.024311433 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:35 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65dfe735af4f365273e69b76bf1edcd01459d8ac37d56ae3a310059779cca2bb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65dfe735af4f365273e69b76bf1edcd01459d8ac37d56ae3a310059779cca2bb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65dfe735af4f365273e69b76bf1edcd01459d8ac37d56ae3a310059779cca2bb/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:35 compute-0 podman[77357]: 2025-12-03 21:08:35.941009028 +0000 UTC m=+0.143335430 container init e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae (image=quay.io/ceph/ceph:v20, name=vibrant_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:35 compute-0 podman[77357]: 2025-12-03 21:08:35.953903747 +0000 UTC m=+0.156230159 container start e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae (image=quay.io/ceph/ceph:v20, name=vibrant_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:35 compute-0 podman[77357]: 2025-12-03 21:08:35.958417708 +0000 UTC m=+0.160744090 container attach e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae (image=quay.io/ceph/ceph:v20, name=vibrant_greider, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:35 compute-0 ceph-mon[75204]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:35 compute-0 ceph-mon[75204]: Saving service mgr spec with placement count:2
Dec 03 21:08:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:36 compute-0 podman[77427]: 2025-12-03 21:08:36.176131268 +0000 UTC m=+0.064975539 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:36 compute-0 podman[77427]: 2025-12-03 21:08:36.304134907 +0000 UTC m=+0.192979228 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:08:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0)
Dec 03 21:08:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3131284038' entity='client.admin' 
Dec 03 21:08:36 compute-0 systemd[1]: libpod-e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae.scope: Deactivated successfully.
Dec 03 21:08:36 compute-0 podman[77357]: 2025-12-03 21:08:36.402064652 +0000 UTC m=+0.604391034 container died e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae (image=quay.io/ceph/ceph:v20, name=vibrant_greider, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-65dfe735af4f365273e69b76bf1edcd01459d8ac37d56ae3a310059779cca2bb-merged.mount: Deactivated successfully.
Dec 03 21:08:36 compute-0 podman[77357]: 2025-12-03 21:08:36.440606595 +0000 UTC m=+0.642932997 container remove e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae (image=quay.io/ceph/ceph:v20, name=vibrant_greider, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:08:36 compute-0 systemd[1]: libpod-conmon-e7da159b758f7ba92741d6da82be0c064d31ee497199417ec490eddfe9ec46ae.scope: Deactivated successfully.
Dec 03 21:08:36 compute-0 podman[77518]: 2025-12-03 21:08:36.497377161 +0000 UTC m=+0.038063124 container create ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81 (image=quay.io/ceph/ceph:v20, name=nervous_northcutt, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:36 compute-0 systemd[1]: Started libpod-conmon-ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81.scope.
Dec 03 21:08:36 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25ca0dc417f3dcc9536c4d10f622b174df9ee43be3edd695f6c465aae1f16035/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25ca0dc417f3dcc9536c4d10f622b174df9ee43be3edd695f6c465aae1f16035/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25ca0dc417f3dcc9536c4d10f622b174df9ee43be3edd695f6c465aae1f16035/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:36 compute-0 podman[77518]: 2025-12-03 21:08:36.563436436 +0000 UTC m=+0.104122429 container init ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81 (image=quay.io/ceph/ceph:v20, name=nervous_northcutt, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:08:36 compute-0 podman[77518]: 2025-12-03 21:08:36.570178753 +0000 UTC m=+0.110864716 container start ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81 (image=quay.io/ceph/ceph:v20, name=nervous_northcutt, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:36 compute-0 podman[77518]: 2025-12-03 21:08:36.573129946 +0000 UTC m=+0.113815939 container attach ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81 (image=quay.io/ceph/ceph:v20, name=nervous_northcutt, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:36 compute-0 sudo[77318]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:36 compute-0 podman[77518]: 2025-12-03 21:08:36.481523408 +0000 UTC m=+0.022209401 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:36 compute-0 sudo[77558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:36 compute-0 sudo[77558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:36 compute-0 sudo[77558]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:36 compute-0 sudo[77583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:08:36 compute-0 sudo[77583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:37 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 77637 (sysctl)
Dec 03 21:08:37 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "label:_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/client_keyrings}] v 0)
Dec 03 21:08:37 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 03 21:08:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:37 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 03 21:08:37 compute-0 systemd[1]: libpod-ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81.scope: Deactivated successfully.
Dec 03 21:08:37 compute-0 podman[77643]: 2025-12-03 21:08:37.123660004 +0000 UTC m=+0.046128742 container died ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81 (image=quay.io/ceph/ceph:v20, name=nervous_northcutt, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:08:37 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-25ca0dc417f3dcc9536c4d10f622b174df9ee43be3edd695f6c465aae1f16035-merged.mount: Deactivated successfully.
Dec 03 21:08:37 compute-0 podman[77643]: 2025-12-03 21:08:37.172867283 +0000 UTC m=+0.095335961 container remove ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81 (image=quay.io/ceph/ceph:v20, name=nervous_northcutt, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:08:37 compute-0 systemd[1]: libpod-conmon-ce2b24863cf0f200e28b520c508f696f8695ea603a1159d7d24d2eebe52bae81.scope: Deactivated successfully.
Dec 03 21:08:37 compute-0 podman[77659]: 2025-12-03 21:08:37.267081926 +0000 UTC m=+0.056248425 container create 001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe (image=quay.io/ceph/ceph:v20, name=gracious_buck, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 03 21:08:37 compute-0 systemd[1]: Started libpod-conmon-001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe.scope.
Dec 03 21:08:37 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c0e823dd64ec187dd32439c00821873f0f129dc586cf447ef648a319f02d9a5/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c0e823dd64ec187dd32439c00821873f0f129dc586cf447ef648a319f02d9a5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c0e823dd64ec187dd32439c00821873f0f129dc586cf447ef648a319f02d9a5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:37 compute-0 podman[77659]: 2025-12-03 21:08:37.246664229 +0000 UTC m=+0.035830738 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:37 compute-0 podman[77659]: 2025-12-03 21:08:37.344327477 +0000 UTC m=+0.133494026 container init 001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe (image=quay.io/ceph/ceph:v20, name=gracious_buck, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030)
Dec 03 21:08:37 compute-0 podman[77659]: 2025-12-03 21:08:37.355819422 +0000 UTC m=+0.144985921 container start 001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe (image=quay.io/ceph/ceph:v20, name=gracious_buck, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:08:37 compute-0 podman[77659]: 2025-12-03 21:08:37.359798961 +0000 UTC m=+0.148965460 container attach 001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe (image=quay.io/ceph/ceph:v20, name=gracious_buck, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:37 compute-0 ceph-mon[75204]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:37 compute-0 ceph-mon[75204]: Saving service crash spec with placement *
Dec 03 21:08:37 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3131284038' entity='client.admin' 
Dec 03 21:08:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:37 compute-0 sudo[77583]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:37 compute-0 sudo[77716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:37 compute-0 sudo[77716]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:37 compute-0 sudo[77716]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:37 compute-0 sudo[77741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 list-networks
Dec 03 21:08:37 compute-0 sudo[77741]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:37 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:37 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14158 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "compute-0", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Dec 03 21:08:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:37 compute-0 ceph-mgr[75500]: [cephadm INFO root] Added label _admin to host compute-0
Dec 03 21:08:37 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Added label _admin to host compute-0
Dec 03 21:08:37 compute-0 gracious_buck[77680]: Added label _admin to host compute-0
Dec 03 21:08:37 compute-0 systemd[1]: libpod-001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe.scope: Deactivated successfully.
Dec 03 21:08:37 compute-0 podman[77659]: 2025-12-03 21:08:37.835357083 +0000 UTC m=+0.624523592 container died 001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe (image=quay.io/ceph/ceph:v20, name=gracious_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:08:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-8c0e823dd64ec187dd32439c00821873f0f129dc586cf447ef648a319f02d9a5-merged.mount: Deactivated successfully.
Dec 03 21:08:37 compute-0 podman[77659]: 2025-12-03 21:08:37.876084581 +0000 UTC m=+0.665251060 container remove 001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe (image=quay.io/ceph/ceph:v20, name=gracious_buck, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:37 compute-0 systemd[1]: libpod-conmon-001cad78786951b6a0ecb62d19ed5829f0bd83e1288e11966e378b5407bd86fe.scope: Deactivated successfully.
Dec 03 21:08:37 compute-0 sudo[77741]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:37 compute-0 podman[77789]: 2025-12-03 21:08:37.963850694 +0000 UTC m=+0.051207449 container create e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122 (image=quay.io/ceph/ceph:v20, name=nice_grothendieck, CEPH_REF=tentacle, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:38 compute-0 systemd[1]: Started libpod-conmon-e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122.scope.
Dec 03 21:08:38 compute-0 sudo[77809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:38 compute-0 sudo[77809]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:38 compute-0 sudo[77809]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:38 compute-0 podman[77789]: 2025-12-03 21:08:37.942723291 +0000 UTC m=+0.030080026 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:38 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7615940a10a6f245efa62cbf82fdc902280c7473a658104463abd64436ad4877/merged/etc/ceph/ceph.client.admin.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7615940a10a6f245efa62cbf82fdc902280c7473a658104463abd64436ad4877/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7615940a10a6f245efa62cbf82fdc902280c7473a658104463abd64436ad4877/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:38 compute-0 podman[77789]: 2025-12-03 21:08:38.068373512 +0000 UTC m=+0.155730247 container init e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122 (image=quay.io/ceph/ceph:v20, name=nice_grothendieck, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:38 compute-0 podman[77789]: 2025-12-03 21:08:38.076666927 +0000 UTC m=+0.164023652 container start e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122 (image=quay.io/ceph/ceph:v20, name=nice_grothendieck, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:38 compute-0 podman[77789]: 2025-12-03 21:08:38.080199794 +0000 UTC m=+0.167556559 container attach e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122 (image=quay.io/ceph/ceph:v20, name=nice_grothendieck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:08:38 compute-0 sudo[77840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- inventory --format=json-pretty --filter-for-batch
Dec 03 21:08:38 compute-0 sudo[77840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:38 compute-0 ceph-mon[75204]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "label:_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:38 compute-0 podman[77898]: 2025-12-03 21:08:38.435775037 +0000 UTC m=+0.044846731 container create 930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_neumann, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:38 compute-0 systemd[1]: Started libpod-conmon-930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3.scope.
Dec 03 21:08:38 compute-0 podman[77898]: 2025-12-03 21:08:38.413347912 +0000 UTC m=+0.022419646 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:38 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:38 compute-0 podman[77898]: 2025-12-03 21:08:38.530005539 +0000 UTC m=+0.139077253 container init 930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_neumann, ceph=True, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:38 compute-0 podman[77898]: 2025-12-03 21:08:38.537077155 +0000 UTC m=+0.146148839 container start 930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_neumann, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:08:38 compute-0 brave_neumann[77914]: 167 167
Dec 03 21:08:38 compute-0 systemd[1]: libpod-930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3.scope: Deactivated successfully.
Dec 03 21:08:38 compute-0 podman[77898]: 2025-12-03 21:08:38.540737875 +0000 UTC m=+0.149809599 container attach 930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_neumann, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:08:38 compute-0 podman[77898]: 2025-12-03 21:08:38.542000547 +0000 UTC m=+0.151072241 container died 930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_neumann, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:08:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-1fa5c6f6bb0c1e3bff4e6d42fe048791a0deb73049268a955733841eb0168f24-merged.mount: Deactivated successfully.
Dec 03 21:08:38 compute-0 podman[77898]: 2025-12-03 21:08:38.57891445 +0000 UTC m=+0.187986134 container remove 930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_neumann, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:38 compute-0 systemd[1]: libpod-conmon-930dd014193e0a3292c67c02510b44915495c89526cb1e00e3d0c396b32807b3.scope: Deactivated successfully.
Dec 03 21:08:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0)
Dec 03 21:08:38 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2346052553' entity='client.admin' 
Dec 03 21:08:38 compute-0 nice_grothendieck[77835]: set mgr/dashboard/cluster/status
Dec 03 21:08:38 compute-0 systemd[1]: libpod-e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122.scope: Deactivated successfully.
Dec 03 21:08:38 compute-0 podman[77789]: 2025-12-03 21:08:38.672701102 +0000 UTC m=+0.760057877 container died e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122 (image=quay.io/ceph/ceph:v20, name=nice_grothendieck, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:08:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-7615940a10a6f245efa62cbf82fdc902280c7473a658104463abd64436ad4877-merged.mount: Deactivated successfully.
Dec 03 21:08:38 compute-0 podman[77789]: 2025-12-03 21:08:38.721264355 +0000 UTC m=+0.808621100 container remove e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122 (image=quay.io/ceph/ceph:v20, name=nice_grothendieck, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:38 compute-0 systemd[1]: libpod-conmon-e4b89c0cea1bdba758d6b79d9c0941b9ae85d72a52442ebe13c2295736120122.scope: Deactivated successfully.
Dec 03 21:08:38 compute-0 systemd[1]: Reloading.
Dec 03 21:08:38 compute-0 systemd-rc-local-generator[77974]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:38 compute-0 systemd-sysv-generator[77981]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:39 compute-0 ceph-mgr[75500]: mgr.server send_report Not sending PG status to monitor yet, waiting for OSDs
Dec 03 21:08:39 compute-0 sudo[74111]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:39 compute-0 podman[77993]: 2025-12-03 21:08:39.348845621 +0000 UTC m=+0.073588364 container create 7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_buck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle)
Dec 03 21:08:39 compute-0 systemd[1]: Started libpod-conmon-7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182.scope.
Dec 03 21:08:39 compute-0 ceph-mon[75204]: from='client.14158 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "compute-0", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:39 compute-0 ceph-mon[75204]: Added label _admin to host compute-0
Dec 03 21:08:39 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2346052553' entity='client.admin' 
Dec 03 21:08:39 compute-0 podman[77993]: 2025-12-03 21:08:39.322317703 +0000 UTC m=+0.047060526 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d448160e9d5320d3b3e4dbb0b6b8a7e2e88348766febda51d0a93f9e9a89adb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d448160e9d5320d3b3e4dbb0b6b8a7e2e88348766febda51d0a93f9e9a89adb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d448160e9d5320d3b3e4dbb0b6b8a7e2e88348766febda51d0a93f9e9a89adb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d448160e9d5320d3b3e4dbb0b6b8a7e2e88348766febda51d0a93f9e9a89adb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:39 compute-0 sudo[78034]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psopkugpgowlmfpytrevfjuznivzwqjq ; /usr/bin/python3'
Dec 03 21:08:39 compute-0 podman[77993]: 2025-12-03 21:08:39.471099396 +0000 UTC m=+0.195842149 container init 7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_buck, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:08:39 compute-0 sudo[78034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:39 compute-0 podman[77993]: 2025-12-03 21:08:39.478411558 +0000 UTC m=+0.203154301 container start 7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_buck, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:39 compute-0 podman[77993]: 2025-12-03 21:08:39.482322404 +0000 UTC m=+0.207065157 container attach 7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_buck, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:08:39 compute-0 python3[78036]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set mgr mgr/cephadm/use_repo_digest false
                                           _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:39 compute-0 podman[78039]: 2025-12-03 21:08:39.673859146 +0000 UTC m=+0.041440377 container create c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17 (image=quay.io/ceph/ceph:v20, name=agitated_galois, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:08:39 compute-0 systemd[1]: Started libpod-conmon-c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17.scope.
Dec 03 21:08:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/959160143b82d12ceb61371b8def0ebe580cce413adc81a9fb578e16b97dc8af/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/959160143b82d12ceb61371b8def0ebe580cce413adc81a9fb578e16b97dc8af/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:39 compute-0 podman[78039]: 2025-12-03 21:08:39.745684334 +0000 UTC m=+0.113265655 container init c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17 (image=quay.io/ceph/ceph:v20, name=agitated_galois, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:08:39 compute-0 podman[78039]: 2025-12-03 21:08:39.653308647 +0000 UTC m=+0.020889898 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:39 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:39 compute-0 podman[78039]: 2025-12-03 21:08:39.757932828 +0000 UTC m=+0.125514089 container start c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17 (image=quay.io/ceph/ceph:v20, name=agitated_galois, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:08:39 compute-0 podman[78039]: 2025-12-03 21:08:39.761980917 +0000 UTC m=+0.129562148 container attach c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17 (image=quay.io/ceph/ceph:v20, name=agitated_galois, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:40 compute-0 sad_buck[78008]: [
Dec 03 21:08:40 compute-0 sad_buck[78008]:     {
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "available": false,
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "being_replaced": false,
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "ceph_device_lvm": false,
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "lsm_data": {},
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "lvs": [],
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "path": "/dev/sr0",
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "rejected_reasons": [
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "Insufficient space (<5GB)",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "Has a FileSystem"
Dec 03 21:08:40 compute-0 sad_buck[78008]:         ],
Dec 03 21:08:40 compute-0 sad_buck[78008]:         "sys_api": {
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "actuators": null,
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "device_nodes": [
Dec 03 21:08:40 compute-0 sad_buck[78008]:                 "sr0"
Dec 03 21:08:40 compute-0 sad_buck[78008]:             ],
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "devname": "sr0",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "human_readable_size": "482.00 KB",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "id_bus": "ata",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "model": "QEMU DVD-ROM",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "nr_requests": "2",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "parent": "/dev/sr0",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "partitions": {},
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "path": "/dev/sr0",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "removable": "1",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "rev": "2.5+",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "ro": "0",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "rotational": "1",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "sas_address": "",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "sas_device_handle": "",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "scheduler_mode": "mq-deadline",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "sectors": 0,
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "sectorsize": "2048",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "size": 493568.0,
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "support_discard": "2048",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "type": "disk",
Dec 03 21:08:40 compute-0 sad_buck[78008]:             "vendor": "QEMU"
Dec 03 21:08:40 compute-0 sad_buck[78008]:         }
Dec 03 21:08:40 compute-0 sad_buck[78008]:     }
Dec 03 21:08:40 compute-0 sad_buck[78008]: ]
Dec 03 21:08:40 compute-0 systemd[1]: libpod-7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182.scope: Deactivated successfully.
Dec 03 21:08:40 compute-0 conmon[78008]: conmon 7d243ddd0e6c84181635 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182.scope/container/memory.events
Dec 03 21:08:40 compute-0 podman[77993]: 2025-12-03 21:08:40.056128889 +0000 UTC m=+0.780871652 container died 7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_buck, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:08:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-2d448160e9d5320d3b3e4dbb0b6b8a7e2e88348766febda51d0a93f9e9a89adb-merged.mount: Deactivated successfully.
Dec 03 21:08:40 compute-0 podman[77993]: 2025-12-03 21:08:40.106730552 +0000 UTC m=+0.831473335 container remove 7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_buck, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:08:40 compute-0 systemd[1]: libpod-conmon-7d243ddd0e6c8418163555a177178b32dfa9e9d61d9a7d89bf78662d1adf4182.scope: Deactivated successfully.
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2409020231' entity='client.admin' 
Dec 03 21:08:40 compute-0 sudo[77840]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:40 compute-0 systemd[1]: libpod-c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17.scope: Deactivated successfully.
Dec 03 21:08:40 compute-0 podman[78039]: 2025-12-03 21:08:40.174728345 +0000 UTC m=+0.542309586 container died c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17 (image=quay.io/ceph/ceph:v20, name=agitated_galois, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:08:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-959160143b82d12ceb61371b8def0ebe580cce413adc81a9fb578e16b97dc8af-merged.mount: Deactivated successfully.
Dec 03 21:08:40 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Updating compute-0:/etc/ceph/ceph.conf
Dec 03 21:08:40 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Updating compute-0:/etc/ceph/ceph.conf
Dec 03 21:08:40 compute-0 podman[78039]: 2025-12-03 21:08:40.219670747 +0000 UTC m=+0.587251988 container remove c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17 (image=quay.io/ceph/ceph:v20, name=agitated_galois, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:08:40 compute-0 systemd[1]: libpod-conmon-c4cffc19d026111f1c7501b84e9375b4c8a590785a0f42ea04cbf035c05eca17.scope: Deactivated successfully.
Dec 03 21:08:40 compute-0 sudo[78034]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[78855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 03 21:08:40 compute-0 sudo[78855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[78855]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[78880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph
Dec 03 21:08:40 compute-0 sudo[78880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[78880]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[78905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.conf.new
Dec 03 21:08:40 compute-0 sudo[78905]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[78905]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[78930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:40 compute-0 sudo[78930]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[78930]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[78955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.conf.new
Dec 03 21:08:40 compute-0 sudo[78955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[78955]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[79052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.conf.new
Dec 03 21:08:40 compute-0 sudo[79052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[79052]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[79103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.conf.new
Dec 03 21:08:40 compute-0 sudo[79103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[79103]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[79128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv -Z /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 03 21:08:40 compute-0 sudo[79128]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[79128]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Updating compute-0:/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf
Dec 03 21:08:40 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Updating compute-0:/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf
Dec 03 21:08:40 compute-0 sudo[79154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config
Dec 03 21:08:40 compute-0 sudo[79154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:40 compute-0 sudo[79154]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:40 compute-0 sudo[79202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config
Dec 03 21:08:40 compute-0 sudo[79202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79202]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 sudo[79298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htniwhemhphqvkcuolinzpztnuowqvft ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764796120.577372-36618-79383468006511/async_wrapper.py j236472321791 30 /home/zuul/.ansible/tmp/ansible-tmp-1764796120.577372-36618-79383468006511/AnsiballZ_command.py _'
Dec 03 21:08:41 compute-0 sudo[79298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:41 compute-0 sudo[79253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf.new
Dec 03 21:08:41 compute-0 sudo[79253]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79253]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 ceph-mgr[75500]: mgr.server send_report Giving up on OSDs that haven't reported yet, sending potentially incomplete PG state to mon
Dec 03 21:08:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:41 compute-0 ceph-mon[75204]: log_channel(cluster) log [WRN] : Health check failed: OSD count 0 < osd_pool_default_size 1 (TOO_FEW_OSDS)
Dec 03 21:08:41 compute-0 sudo[79303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2409020231' entity='client.admin' 
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:41 compute-0 ceph-mon[75204]: Updating compute-0:/etc/ceph/ceph.conf
Dec 03 21:08:41 compute-0 ceph-mon[75204]: Health check failed: OSD count 0 < osd_pool_default_size 1 (TOO_FEW_OSDS)
Dec 03 21:08:41 compute-0 sudo[79303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79303]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 ansible-async_wrapper.py[79301]: Invoked with j236472321791 30 /home/zuul/.ansible/tmp/ansible-tmp-1764796120.577372-36618-79383468006511/AnsiballZ_command.py _
Dec 03 21:08:41 compute-0 ansible-async_wrapper.py[79353]: Starting module and watcher
Dec 03 21:08:41 compute-0 ansible-async_wrapper.py[79353]: Start watching 79354 (30)
Dec 03 21:08:41 compute-0 ansible-async_wrapper.py[79354]: Start module (79354)
Dec 03 21:08:41 compute-0 ansible-async_wrapper.py[79301]: Return async_wrapper task started.
Dec 03 21:08:41 compute-0 sudo[79328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf.new
Dec 03 21:08:41 compute-0 sudo[79328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79298]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 sudo[79328]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 sudo[79381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf.new
Dec 03 21:08:41 compute-0 sudo[79381]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79381]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 python3[79355]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:41 compute-0 sudo[79407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf.new
Dec 03 21:08:41 compute-0 sudo[79407]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79407]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 podman[79406]: 2025-12-03 21:08:41.459285155 +0000 UTC m=+0.062867607 container create fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376 (image=quay.io/ceph/ceph:v20, name=bold_shirley, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:41 compute-0 systemd[1]: Started libpod-conmon-fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376.scope.
Dec 03 21:08:41 compute-0 sudo[79444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv -Z /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf.new /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf
Dec 03 21:08:41 compute-0 sudo[79444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79444]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 podman[79406]: 2025-12-03 21:08:41.432979024 +0000 UTC m=+0.036561476 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:41 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 03 21:08:41 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 03 21:08:41 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b87e353a529284a1e7e536b1ed1839b2c9c931d23cf5be0e34926e7bd9747ac/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b87e353a529284a1e7e536b1ed1839b2c9c931d23cf5be0e34926e7bd9747ac/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:41 compute-0 podman[79406]: 2025-12-03 21:08:41.56612536 +0000 UTC m=+0.169707892 container init fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376 (image=quay.io/ceph/ceph:v20, name=bold_shirley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:08:41 compute-0 podman[79406]: 2025-12-03 21:08:41.57463442 +0000 UTC m=+0.178216862 container start fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376 (image=quay.io/ceph/ceph:v20, name=bold_shirley, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:08:41 compute-0 podman[79406]: 2025-12-03 21:08:41.578338252 +0000 UTC m=+0.181920684 container attach fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376 (image=quay.io/ceph/ceph:v20, name=bold_shirley, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:41 compute-0 sudo[79474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 03 21:08:41 compute-0 sudo[79474]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79474]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 sudo[79500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph
Dec 03 21:08:41 compute-0 sudo[79500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79500]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:41 compute-0 sudo[79526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.client.admin.keyring.new
Dec 03 21:08:41 compute-0 sudo[79526]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79526]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:41 compute-0 sudo[79569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:41 compute-0 sudo[79569]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79569]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:41 compute-0 sudo[79594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.client.admin.keyring.new
Dec 03 21:08:41 compute-0 sudo[79594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:41 compute-0 sudo[79594]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[79642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.client.admin.keyring.new
Dec 03 21:08:42 compute-0 sudo[79642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14164 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:08:42 compute-0 bold_shirley[79470]: 
Dec 03 21:08:42 compute-0 bold_shirley[79470]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 03 21:08:42 compute-0 sudo[79642]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 systemd[1]: libpod-fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376.scope: Deactivated successfully.
Dec 03 21:08:42 compute-0 podman[79406]: 2025-12-03 21:08:42.033948321 +0000 UTC m=+0.637530783 container died fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376 (image=quay.io/ceph/ceph:v20, name=bold_shirley, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-9b87e353a529284a1e7e536b1ed1839b2c9c931d23cf5be0e34926e7bd9747ac-merged.mount: Deactivated successfully.
Dec 03 21:08:42 compute-0 podman[79406]: 2025-12-03 21:08:42.075749296 +0000 UTC m=+0.679331738 container remove fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376 (image=quay.io/ceph/ceph:v20, name=bold_shirley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:08:42 compute-0 ansible-async_wrapper.py[79354]: Module complete (79354)
Dec 03 21:08:42 compute-0 systemd[1]: libpod-conmon-fccf07bccd2930f03f7d6bce7e1e1f526a0f7ac42fccee976b37a61114281376.scope: Deactivated successfully.
Dec 03 21:08:42 compute-0 sudo[79669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.client.admin.keyring.new
Dec 03 21:08:42 compute-0 sudo[79669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79669]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 ceph-mon[75204]: Updating compute-0:/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.conf
Dec 03 21:08:42 compute-0 ceph-mon[75204]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:42 compute-0 sudo[79706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv -Z /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 03 21:08:42 compute-0 sudo[79706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79706]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Updating compute-0:/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring
Dec 03 21:08:42 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Updating compute-0:/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring
Dec 03 21:08:42 compute-0 sudo[79731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config
Dec 03 21:08:42 compute-0 sudo[79731]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79731]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[79774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config
Dec 03 21:08:42 compute-0 sudo[79774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79774]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[79804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring.new
Dec 03 21:08:42 compute-0 sudo[79804]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79804]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[79829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:42 compute-0 sudo[79829]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79829]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[79877]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqfsewrkjsxhnrblxseutawqqpbhdoyb ; /usr/bin/python3'
Dec 03 21:08:42 compute-0 sudo[79877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:42 compute-0 sudo[79878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring.new
Dec 03 21:08:42 compute-0 sudo[79878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79878]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 python3[79884]: ansible-ansible.legacy.async_status Invoked with jid=j236472321791.79301 mode=status _async_dir=/root/.ansible_async
Dec 03 21:08:42 compute-0 sudo[79877]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[79928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring.new
Dec 03 21:08:42 compute-0 sudo[79928]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79928]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[79954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring.new
Dec 03 21:08:42 compute-0 sudo[79954]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[79954]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 sudo[80047]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aecdrbqadgvsjsaloxlqyeyerszpjbeq ; /usr/bin/python3'
Dec 03 21:08:42 compute-0 sudo[80005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv -Z /tmp/cephadm-c21de27e-a7fd-594b-8324-0697ba9aab3a/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring.new /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring
Dec 03 21:08:42 compute-0 sudo[80005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:42 compute-0 sudo[80047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:42 compute-0 sudo[80005]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:08:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:42 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev db5c9dd6-9943-4ed1-bec4-7b67ea0da67c (Updating crash deployment (+1 -> 1))
Dec 03 21:08:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 03 21:08:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 03 21:08:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec 03 21:08:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:42 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Deploying daemon crash.compute-0 on compute-0
Dec 03 21:08:42 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Deploying daemon crash.compute-0 on compute-0
Dec 03 21:08:43 compute-0 sudo[80052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:43 compute-0 sudo[80052]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:43 compute-0 sudo[80052]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:43 compute-0 python3[80051]: ansible-ansible.legacy.async_status Invoked with jid=j236472321791.79301 mode=cleanup _async_dir=/root/.ansible_async
Dec 03 21:08:43 compute-0 sudo[80047]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:43 compute-0 sudo[80077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:43 compute-0 sudo[80077]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:43 compute-0 ceph-mon[75204]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 03 21:08:43 compute-0 ceph-mon[75204]: from='client.14164 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:08:43 compute-0 ceph-mon[75204]: Updating compute-0:/var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/config/ceph.client.admin.keyring
Dec 03 21:08:43 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:43 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:43 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:43 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 03 21:08:43 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec 03 21:08:43 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:43 compute-0 sudo[80143]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwlwxwehgpzxxunxxurshymrjsvyjjup ; /usr/bin/python3'
Dec 03 21:08:43 compute-0 sudo[80143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:43 compute-0 python3[80152]: ansible-ansible.builtin.stat Invoked with path=/home/ceph-admin/specs/ceph_spec.yaml follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 03 21:08:43 compute-0 podman[80172]: 2025-12-03 21:08:43.578536598 +0000 UTC m=+0.068315072 container create 1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_heisenberg, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:08:43 compute-0 sudo[80143]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:43 compute-0 systemd[1]: Started libpod-conmon-1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78.scope.
Dec 03 21:08:43 compute-0 podman[80172]: 2025-12-03 21:08:43.54830932 +0000 UTC m=+0.038087864 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:43 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:43 compute-0 podman[80172]: 2025-12-03 21:08:43.672261658 +0000 UTC m=+0.162040152 container init 1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_heisenberg, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:08:43 compute-0 podman[80172]: 2025-12-03 21:08:43.678712558 +0000 UTC m=+0.168491002 container start 1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_heisenberg, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:43 compute-0 podman[80172]: 2025-12-03 21:08:43.682595745 +0000 UTC m=+0.172374219 container attach 1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_heisenberg, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:43 compute-0 flamboyant_heisenberg[80191]: 167 167
Dec 03 21:08:43 compute-0 systemd[1]: libpod-1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78.scope: Deactivated successfully.
Dec 03 21:08:43 compute-0 podman[80172]: 2025-12-03 21:08:43.685275231 +0000 UTC m=+0.175053715 container died 1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_heisenberg, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-14e6289cf6278c15034ca5b774064a47f3a7949cfd34367cb71476760f5ac2a5-merged.mount: Deactivated successfully.
Dec 03 21:08:43 compute-0 podman[80172]: 2025-12-03 21:08:43.725446895 +0000 UTC m=+0.215225339 container remove 1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_heisenberg, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030)
Dec 03 21:08:43 compute-0 systemd[1]: libpod-conmon-1f193c52cc44ac034d8c4f723c4d9bd32d9a3c2eb891f196c2a1263d55976d78.scope: Deactivated successfully.
Dec 03 21:08:43 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:43 compute-0 systemd[1]: Reloading.
Dec 03 21:08:43 compute-0 systemd-rc-local-generator[80237]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:43 compute-0 systemd-sysv-generator[80242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:44 compute-0 sudo[80267]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sljgxtfqlkrvvzpkaqcvogmrjdpztoez ; /usr/bin/python3'
Dec 03 21:08:44 compute-0 sudo[80267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:44 compute-0 systemd[1]: Reloading.
Dec 03 21:08:44 compute-0 ceph-mon[75204]: Deploying daemon crash.compute-0 on compute-0
Dec 03 21:08:44 compute-0 ceph-mon[75204]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:44 compute-0 systemd-rc-local-generator[80299]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:44 compute-0 systemd-sysv-generator[80304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:44 compute-0 python3[80271]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:44 compute-0 podman[80310]: 2025-12-03 21:08:44.298990654 +0000 UTC m=+0.068554539 container create 7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894 (image=quay.io/ceph/ceph:v20, name=eager_khorana, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:44 compute-0 podman[80310]: 2025-12-03 21:08:44.270845277 +0000 UTC m=+0.040409172 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:44 compute-0 systemd[1]: Started libpod-conmon-7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894.scope.
Dec 03 21:08:44 compute-0 systemd[1]: Starting Ceph crash.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:08:44 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64babdf42d8e77669e50cba940473111ffb176c1aac908ac788366e9e917b8d0/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64babdf42d8e77669e50cba940473111ffb176c1aac908ac788366e9e917b8d0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64babdf42d8e77669e50cba940473111ffb176c1aac908ac788366e9e917b8d0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:44 compute-0 podman[80310]: 2025-12-03 21:08:44.432781455 +0000 UTC m=+0.202345350 container init 7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894 (image=quay.io/ceph/ceph:v20, name=eager_khorana, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:44 compute-0 podman[80310]: 2025-12-03 21:08:44.440548757 +0000 UTC m=+0.210112612 container start 7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894 (image=quay.io/ceph/ceph:v20, name=eager_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:08:44 compute-0 podman[80310]: 2025-12-03 21:08:44.444037894 +0000 UTC m=+0.213601759 container attach 7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894 (image=quay.io/ceph/ceph:v20, name=eager_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:44 compute-0 podman[80399]: 2025-12-03 21:08:44.636291673 +0000 UTC m=+0.044226076 container create 4b1e1515111c30158193d0a92c01f0c36debcabdefc45ebea0a47572a2dd93c1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:08:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b68ea16925b1ec6a0640fec20d7b7c2947c5283d548e299f57f55ceaf0ef380/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b68ea16925b1ec6a0640fec20d7b7c2947c5283d548e299f57f55ceaf0ef380/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b68ea16925b1ec6a0640fec20d7b7c2947c5283d548e299f57f55ceaf0ef380/merged/etc/ceph/ceph.client.crash.compute-0.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b68ea16925b1ec6a0640fec20d7b7c2947c5283d548e299f57f55ceaf0ef380/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:44 compute-0 podman[80399]: 2025-12-03 21:08:44.708070501 +0000 UTC m=+0.116004954 container init 4b1e1515111c30158193d0a92c01f0c36debcabdefc45ebea0a47572a2dd93c1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:08:44 compute-0 podman[80399]: 2025-12-03 21:08:44.615627632 +0000 UTC m=+0.023562055 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:44 compute-0 podman[80399]: 2025-12-03 21:08:44.718627391 +0000 UTC m=+0.126561814 container start 4b1e1515111c30158193d0a92c01f0c36debcabdefc45ebea0a47572a2dd93c1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:08:44 compute-0 bash[80399]: 4b1e1515111c30158193d0a92c01f0c36debcabdefc45ebea0a47572a2dd93c1
Dec 03 21:08:44 compute-0 systemd[1]: Started Ceph crash.compute-0 for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:08:44 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0[80413]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 03 21:08:44 compute-0 sudo[80077]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:44 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev db5c9dd6-9943-4ed1-bec4-7b67ea0da67c (Updating crash deployment (+1 -> 1))
Dec 03 21:08:44 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event db5c9dd6-9943-4ed1-bec4-7b67ea0da67c (Updating crash deployment (+1 -> 1)) in 2 seconds
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.crash}] v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:44 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 1b645393-3007-4304-b52b-a35e10c6aa55 (Updating mgr deployment (+1 -> 2))
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.compute-0.jdapcy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "mgr.compute-0.jdapcy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.jdapcy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:08:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:44 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Deploying daemon mgr.compute-0.jdapcy on compute-0
Dec 03 21:08:44 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Deploying daemon mgr.compute-0.jdapcy on compute-0
Dec 03 21:08:44 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14166 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:08:44 compute-0 eager_khorana[80328]: 
Dec 03 21:08:44 compute-0 eager_khorana[80328]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 03 21:08:44 compute-0 systemd[1]: libpod-7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894.scope: Deactivated successfully.
Dec 03 21:08:44 compute-0 podman[80310]: 2025-12-03 21:08:44.895859109 +0000 UTC m=+0.665423034 container died 7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894 (image=quay.io/ceph/ceph:v20, name=eager_khorana, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:44 compute-0 sudo[80420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:44 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0[80413]: 2025-12-03T21:08:44.912+0000 7f35db661640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 03 21:08:44 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0[80413]: 2025-12-03T21:08:44.912+0000 7f35db661640 -1 AuthRegistry(0x7f35d4052930) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 03 21:08:44 compute-0 sudo[80420]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:44 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0[80413]: 2025-12-03T21:08:44.913+0000 7f35db661640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 03 21:08:44 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0[80413]: 2025-12-03T21:08:44.913+0000 7f35db661640 -1 AuthRegistry(0x7f35db65ffe0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 03 21:08:44 compute-0 sudo[80420]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:44 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0[80413]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 03 21:08:44 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-crash-compute-0[80413]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec 03 21:08:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-64babdf42d8e77669e50cba940473111ffb176c1aac908ac788366e9e917b8d0-merged.mount: Deactivated successfully.
Dec 03 21:08:44 compute-0 podman[80310]: 2025-12-03 21:08:44.946476492 +0000 UTC m=+0.716040337 container remove 7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894 (image=quay.io/ceph/ceph:v20, name=eager_khorana, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:44 compute-0 sudo[80267]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:44 compute-0 systemd[1]: libpod-conmon-7f2de2732dadf5048c84c2ee2e47df3e9d7e817fc291d5a3262a708190a87894.scope: Deactivated successfully.
Dec 03 21:08:44 compute-0 sudo[80464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:44 compute-0 sudo[80464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:45 compute-0 sudo[80533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lysbgvfgzntdylwibuurcroouznqacfd ; /usr/bin/python3'
Dec 03 21:08:45 compute-0 sudo[80533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:45 compute-0 podman[80559]: 2025-12-03 21:08:45.426123946 +0000 UTC m=+0.055449184 container create 8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_kare, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:45 compute-0 python3[80542]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set global log_to_file true _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:45 compute-0 systemd[1]: Started libpod-conmon-8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011.scope.
Dec 03 21:08:45 compute-0 podman[80559]: 2025-12-03 21:08:45.395662972 +0000 UTC m=+0.024988250 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:45 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:45 compute-0 podman[80576]: 2025-12-03 21:08:45.528441929 +0000 UTC m=+0.057916125 container create 7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee (image=quay.io/ceph/ceph:v20, name=pedantic_stonebraker, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:45 compute-0 podman[80559]: 2025-12-03 21:08:45.5353329 +0000 UTC m=+0.164658218 container init 8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_kare, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:45 compute-0 podman[80559]: 2025-12-03 21:08:45.546737262 +0000 UTC m=+0.176062490 container start 8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_kare, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:45 compute-0 podman[80559]: 2025-12-03 21:08:45.549645464 +0000 UTC m=+0.178970732 container attach 8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_kare, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:08:45 compute-0 interesting_kare[80582]: 167 167
Dec 03 21:08:45 compute-0 systemd[1]: libpod-8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011.scope: Deactivated successfully.
Dec 03 21:08:45 compute-0 podman[80559]: 2025-12-03 21:08:45.551341756 +0000 UTC m=+0.180666994 container died 8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_kare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:08:45 compute-0 systemd[1]: Started libpod-conmon-7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee.scope.
Dec 03 21:08:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-1d3c4c9e4efce36bff79b2afb30299d002ba6cbb02340ba4ad93e45adc1c78eb-merged.mount: Deactivated successfully.
Dec 03 21:08:45 compute-0 podman[80559]: 2025-12-03 21:08:45.588425064 +0000 UTC m=+0.217750292 container remove 8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_kare, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:08:45 compute-0 podman[80576]: 2025-12-03 21:08:45.501057211 +0000 UTC m=+0.030531497 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:45 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9531cb668dcac972a4e526cdde90de7ab00350f65339d88f5e1b979e8a6ab254/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9531cb668dcac972a4e526cdde90de7ab00350f65339d88f5e1b979e8a6ab254/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9531cb668dcac972a4e526cdde90de7ab00350f65339d88f5e1b979e8a6ab254/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:45 compute-0 systemd[1]: libpod-conmon-8db86bd0acba44697c5c23d720b637d5787aed5f697ffd9bef64abe88e696011.scope: Deactivated successfully.
Dec 03 21:08:45 compute-0 podman[80576]: 2025-12-03 21:08:45.613535856 +0000 UTC m=+0.143010092 container init 7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee (image=quay.io/ceph/ceph:v20, name=pedantic_stonebraker, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:08:45 compute-0 podman[80576]: 2025-12-03 21:08:45.624172999 +0000 UTC m=+0.153647185 container start 7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee (image=quay.io/ceph/ceph:v20, name=pedantic_stonebraker, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:45 compute-0 podman[80576]: 2025-12-03 21:08:45.627987623 +0000 UTC m=+0.157461879 container attach 7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee (image=quay.io/ceph/ceph:v20, name=pedantic_stonebraker, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:45 compute-0 systemd[1]: Reloading.
Dec 03 21:08:45 compute-0 systemd-rc-local-generator[80639]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:45 compute-0 systemd-sysv-generator[80643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:45 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "mgr.compute-0.jdapcy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.jdapcy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:45 compute-0 ceph-mon[75204]: Deploying daemon mgr.compute-0.jdapcy on compute-0
Dec 03 21:08:45 compute-0 ceph-mon[75204]: from='client.14166 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:08:45 compute-0 ceph-mon[75204]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:45 compute-0 systemd[1]: Reloading.
Dec 03 21:08:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=log_to_file}] v 0)
Dec 03 21:08:46 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/757938590' entity='client.admin' 
Dec 03 21:08:46 compute-0 systemd-rc-local-generator[80701]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:46 compute-0 systemd-sysv-generator[80706]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:46 compute-0 podman[80576]: 2025-12-03 21:08:46.105797732 +0000 UTC m=+0.635271928 container died 7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee (image=quay.io/ceph/ceph:v20, name=pedantic_stonebraker, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:08:46 compute-0 ansible-async_wrapper.py[79353]: Done in kid B.
Dec 03 21:08:46 compute-0 systemd[1]: libpod-7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee.scope: Deactivated successfully.
Dec 03 21:08:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-9531cb668dcac972a4e526cdde90de7ab00350f65339d88f5e1b979e8a6ab254-merged.mount: Deactivated successfully.
Dec 03 21:08:46 compute-0 podman[80576]: 2025-12-03 21:08:46.277547614 +0000 UTC m=+0.807021830 container remove 7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee (image=quay.io/ceph/ceph:v20, name=pedantic_stonebraker, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True)
Dec 03 21:08:46 compute-0 systemd[1]: Starting Ceph mgr.compute-0.jdapcy for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:08:46 compute-0 systemd[1]: libpod-conmon-7dd0eaaab2485872230f5e3286218a71930163741055fd0f0f28749a5a6b8dee.scope: Deactivated successfully.
Dec 03 21:08:46 compute-0 sudo[80533]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:46 compute-0 sudo[80777]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjxxsvkcrwjnptcvvwurhthzuitkpkpd ; /usr/bin/python3'
Dec 03 21:08:46 compute-0 sudo[80777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:46 compute-0 python3[80783]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config set global mon_cluster_log_to_file true _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:46 compute-0 podman[80796]: 2025-12-03 21:08:46.630298436 +0000 UTC m=+0.062422386 container create 70b4c07d62442eeb897b819626352cfaa1be138f8695a20ab1ca8ed143214483 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:08:46 compute-0 podman[80796]: 2025-12-03 21:08:46.598773486 +0000 UTC m=+0.030897526 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fed3044219a64be64de93285b6cd8db400e3b0cdcae1095ce876c7b38efed9d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fed3044219a64be64de93285b6cd8db400e3b0cdcae1095ce876c7b38efed9d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fed3044219a64be64de93285b6cd8db400e3b0cdcae1095ce876c7b38efed9d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fed3044219a64be64de93285b6cd8db400e3b0cdcae1095ce876c7b38efed9d/merged/var/lib/ceph/mgr/ceph-compute-0.jdapcy supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:46 compute-0 podman[80796]: 2025-12-03 21:08:46.72861032 +0000 UTC m=+0.160734300 container init 70b4c07d62442eeb897b819626352cfaa1be138f8695a20ab1ca8ed143214483 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:46 compute-0 podman[80796]: 2025-12-03 21:08:46.740880573 +0000 UTC m=+0.173004523 container start 70b4c07d62442eeb897b819626352cfaa1be138f8695a20ab1ca8ed143214483 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:46 compute-0 bash[80796]: 70b4c07d62442eeb897b819626352cfaa1be138f8695a20ab1ca8ed143214483
Dec 03 21:08:46 compute-0 podman[80809]: 2025-12-03 21:08:46.751113877 +0000 UTC m=+0.084502523 container create 64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1 (image=quay.io/ceph/ceph:v20, name=laughing_jones, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:46 compute-0 ceph-mgr[75500]: [progress INFO root] Writing back 1 completed events
Dec 03 21:08:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 03 21:08:46 compute-0 systemd[1]: Started Ceph mgr.compute-0.jdapcy for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:08:46 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:46 compute-0 systemd[1]: Started libpod-conmon-64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1.scope.
Dec 03 21:08:46 compute-0 podman[80809]: 2025-12-03 21:08:46.715402773 +0000 UTC m=+0.048812579 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:46 compute-0 ceph-mgr[80827]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:08:46 compute-0 ceph-mgr[80827]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-mgr, pid 2
Dec 03 21:08:46 compute-0 ceph-mgr[80827]: pidfile_write: ignore empty --pid-file
Dec 03 21:08:46 compute-0 sudo[80464]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:46 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:46 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Dec 03 21:08:46 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:46 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:46 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 1b645393-3007-4304-b52b-a35e10c6aa55 (Updating mgr deployment (+1 -> 2))
Dec 03 21:08:46 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 1b645393-3007-4304-b52b-a35e10c6aa55 (Updating mgr deployment (+1 -> 2)) in 2 seconds
Dec 03 21:08:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Dec 03 21:08:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf98536e654af307a99fc0de9bbeb9bffc9d01cfc5c508f82a03c1f3ba7168b8/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf98536e654af307a99fc0de9bbeb9bffc9d01cfc5c508f82a03c1f3ba7168b8/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf98536e654af307a99fc0de9bbeb9bffc9d01cfc5c508f82a03c1f3ba7168b8/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:46 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:46 compute-0 podman[80809]: 2025-12-03 21:08:46.859630184 +0000 UTC m=+0.193018840 container init 64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1 (image=quay.io/ceph/ceph:v20, name=laughing_jones, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:46 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'alerts'
Dec 03 21:08:46 compute-0 podman[80809]: 2025-12-03 21:08:46.868496633 +0000 UTC m=+0.201885289 container start 64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1 (image=quay.io/ceph/ceph:v20, name=laughing_jones, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:08:46 compute-0 podman[80809]: 2025-12-03 21:08:46.871805975 +0000 UTC m=+0.205194611 container attach 64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1 (image=quay.io/ceph/ceph:v20, name=laughing_jones, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:46 compute-0 sudo[80854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:08:46 compute-0 sudo[80854]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:46 compute-0 sudo[80854]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:46 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'balancer'
Dec 03 21:08:47 compute-0 sudo[80879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:47 compute-0 sudo[80879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:47 compute-0 sudo[80879]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:47 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'cephadm'
Dec 03 21:08:47 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/757938590' entity='client.admin' 
Dec 03 21:08:47 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:47 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:47 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:47 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:47 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:47 compute-0 sudo[80923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:08:47 compute-0 sudo[80923]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mon_cluster_log_to_file}] v 0)
Dec 03 21:08:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1276199958' entity='client.admin' 
Dec 03 21:08:47 compute-0 systemd[1]: libpod-64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1.scope: Deactivated successfully.
Dec 03 21:08:47 compute-0 podman[80809]: 2025-12-03 21:08:47.32120397 +0000 UTC m=+0.654592606 container died 64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1 (image=quay.io/ceph/ceph:v20, name=laughing_jones, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:08:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf98536e654af307a99fc0de9bbeb9bffc9d01cfc5c508f82a03c1f3ba7168b8-merged.mount: Deactivated successfully.
Dec 03 21:08:47 compute-0 podman[80809]: 2025-12-03 21:08:47.359803455 +0000 UTC m=+0.693192091 container remove 64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1 (image=quay.io/ceph/ceph:v20, name=laughing_jones, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:08:47 compute-0 systemd[1]: libpod-conmon-64bdbc3e3538ca2f88b2304144f7243e40aa769b329fc8b0c076f1a1af6352e1.scope: Deactivated successfully.
Dec 03 21:08:47 compute-0 sudo[80777]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:47 compute-0 podman[81015]: 2025-12-03 21:08:47.517126249 +0000 UTC m=+0.048255355 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:08:47 compute-0 sudo[81060]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqarqxjftywoeeqcjpaqluwpxdensyqh ; /usr/bin/python3'
Dec 03 21:08:47 compute-0 sudo[81060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:47 compute-0 podman[81015]: 2025-12-03 21:08:47.619071034 +0000 UTC m=+0.150200140 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:47 compute-0 python3[81062]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd set-require-min-compat-client mimic
                                           _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:47 compute-0 podman[81087]: 2025-12-03 21:08:47.75059371 +0000 UTC m=+0.037366317 container create c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980 (image=quay.io/ceph/ceph:v20, name=frosty_jennings, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:47 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:47 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'crash'
Dec 03 21:08:47 compute-0 systemd[1]: Started libpod-conmon-c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980.scope.
Dec 03 21:08:47 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a926c7b4f2980dfffa2f0c19cf853686255e0a298843ed380ac5796969de547a/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a926c7b4f2980dfffa2f0c19cf853686255e0a298843ed380ac5796969de547a/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a926c7b4f2980dfffa2f0c19cf853686255e0a298843ed380ac5796969de547a/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:47 compute-0 podman[81087]: 2025-12-03 21:08:47.824989521 +0000 UTC m=+0.111762148 container init c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980 (image=quay.io/ceph/ceph:v20, name=frosty_jennings, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:08:47 compute-0 podman[81087]: 2025-12-03 21:08:47.831662497 +0000 UTC m=+0.118435094 container start c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980 (image=quay.io/ceph/ceph:v20, name=frosty_jennings, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:47 compute-0 podman[81087]: 2025-12-03 21:08:47.834241971 +0000 UTC m=+0.121014568 container attach c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980 (image=quay.io/ceph/ceph:v20, name=frosty_jennings, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:08:47 compute-0 podman[81087]: 2025-12-03 21:08:47.736599543 +0000 UTC m=+0.023372150 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:47 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'dashboard'
Dec 03 21:08:48 compute-0 sudo[80923]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd set-require-min-compat-client", "version": "mimic"} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/88973863' entity='client.admin' cmd={"prefix": "osd set-require-min-compat-client", "version": "mimic"} : dispatch
Dec 03 21:08:48 compute-0 sudo[81193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:08:48 compute-0 sudo[81193]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:48 compute-0 sudo[81193]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Reconfiguring mon.compute-0 (unknown last config time)...
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Reconfiguring mon.compute-0 (unknown last config time)...
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.compute-0 on compute-0
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.compute-0 on compute-0
Dec 03 21:08:48 compute-0 ceph-mon[75204]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1276199958' entity='client.admin' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/88973863' entity='client.admin' cmd={"prefix": "osd set-require-min-compat-client", "version": "mimic"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:48 compute-0 sudo[81219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:48 compute-0 sudo[81219]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:48 compute-0 sudo[81219]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:48 compute-0 sudo[81244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph:v20 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:48 compute-0 sudo[81244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:48 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'devicehealth'
Dec 03 21:08:48 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'diskprediction_local'
Dec 03 21:08:48 compute-0 podman[81283]: 2025-12-03 21:08:48.708388531 +0000 UTC m=+0.041294974 container create ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d (image=quay.io/ceph/ceph:v20, name=silly_yonath, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:48 compute-0 systemd[1]: Started libpod-conmon-ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d.scope.
Dec 03 21:08:48 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:48 compute-0 podman[81283]: 2025-12-03 21:08:48.692847495 +0000 UTC m=+0.025753968 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:48 compute-0 podman[81283]: 2025-12-03 21:08:48.834372299 +0000 UTC m=+0.167278752 container init ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d (image=quay.io/ceph/ceph:v20, name=silly_yonath, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:48 compute-0 podman[81283]: 2025-12-03 21:08:48.840954602 +0000 UTC m=+0.173861045 container start ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d (image=quay.io/ceph/ceph:v20, name=silly_yonath, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:08:48 compute-0 podman[81283]: 2025-12-03 21:08:48.844997822 +0000 UTC m=+0.177904285 container attach ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d (image=quay.io/ceph/ceph:v20, name=silly_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 03 21:08:48 compute-0 silly_yonath[81300]: 167 167
Dec 03 21:08:48 compute-0 systemd[1]: libpod-ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d.scope: Deactivated successfully.
Dec 03 21:08:48 compute-0 podman[81283]: 2025-12-03 21:08:48.846627722 +0000 UTC m=+0.179534165 container died ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d (image=quay.io/ceph/ceph:v20, name=silly_yonath, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:08:48 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy[80817]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 03 21:08:48 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy[80817]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 03 21:08:48 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy[80817]:   from numpy import show_config as show_numpy_config
Dec 03 21:08:48 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'influx'
Dec 03 21:08:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-83e7aee2e28b5b5ec462036bf85d3ec63a0ad824fe680b12dd8dbfbd2baf9d2c-merged.mount: Deactivated successfully.
Dec 03 21:08:48 compute-0 podman[81283]: 2025-12-03 21:08:48.900601518 +0000 UTC m=+0.233507961 container remove ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d (image=quay.io/ceph/ceph:v20, name=silly_yonath, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:48 compute-0 systemd[1]: libpod-conmon-ce76797fc46dcb62480e2b286aefba767f975233028638155ef3b6e9a967912d.scope: Deactivated successfully.
Dec 03 21:08:48 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'insights'
Dec 03 21:08:48 compute-0 sudo[81244]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Reconfiguring mgr.compute-0.jxauqt (unknown last config time)...
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Reconfiguring mgr.compute-0.jxauqt (unknown last config time)...
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.compute-0.jxauqt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "mgr.compute-0.jxauqt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.compute-0.jxauqt on compute-0
Dec 03 21:08:48 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.compute-0.jxauqt on compute-0
Dec 03 21:08:49 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'iostat'
Dec 03 21:08:49 compute-0 sudo[81317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:49 compute-0 sudo[81317]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:49 compute-0 sudo[81317]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:49 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'k8sevents'
Dec 03 21:08:49 compute-0 sudo[81342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph:v20 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:08:49 compute-0 sudo[81342]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 do_prune osdmap full prune enabled
Dec 03 21:08:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e2 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:08:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/88973863' entity='client.admin' cmd='[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]': finished
Dec 03 21:08:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e3 e3: 0 total, 0 up, 0 in
Dec 03 21:08:49 compute-0 frosty_jennings[81116]: set require_min_compat_client to mimic
Dec 03 21:08:49 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e3: 0 total, 0 up, 0 in
Dec 03 21:08:49 compute-0 systemd[1]: libpod-c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980.scope: Deactivated successfully.
Dec 03 21:08:49 compute-0 podman[81087]: 2025-12-03 21:08:49.249726011 +0000 UTC m=+1.536498598 container died c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980 (image=quay.io/ceph/ceph:v20, name=frosty_jennings, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-a926c7b4f2980dfffa2f0c19cf853686255e0a298843ed380ac5796969de547a-merged.mount: Deactivated successfully.
Dec 03 21:08:49 compute-0 podman[81087]: 2025-12-03 21:08:49.287936168 +0000 UTC m=+1.574708755 container remove c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980 (image=quay.io/ceph/ceph:v20, name=frosty_jennings, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:08:49 compute-0 systemd[1]: libpod-conmon-c6105edeb663c8ea74899b28822ad662efc9d2b60daa7b55178f19d8f0dbe980.scope: Deactivated successfully.
Dec 03 21:08:49 compute-0 ceph-mon[75204]: Reconfiguring mon.compute-0 (unknown last config time)...
Dec 03 21:08:49 compute-0 ceph-mon[75204]: Reconfiguring daemon mon.compute-0 on compute-0
Dec 03 21:08:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "mgr.compute-0.jxauqt", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 03 21:08:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:08:49 compute-0 sudo[81060]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:49 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/88973863' entity='client.admin' cmd='[{"prefix": "osd set-require-min-compat-client", "version": "mimic"}]': finished
Dec 03 21:08:49 compute-0 ceph-mon[75204]: osdmap e3: 0 total, 0 up, 0 in
Dec 03 21:08:49 compute-0 podman[81395]: 2025-12-03 21:08:49.382685192 +0000 UTC m=+0.040483572 container create b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14 (image=quay.io/ceph/ceph:v20, name=laughing_meninsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:49 compute-0 systemd[1]: Started libpod-conmon-b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14.scope.
Dec 03 21:08:49 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:49 compute-0 podman[81395]: 2025-12-03 21:08:49.361899628 +0000 UTC m=+0.019698018 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:49 compute-0 podman[81395]: 2025-12-03 21:08:49.46536512 +0000 UTC m=+0.123163530 container init b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14 (image=quay.io/ceph/ceph:v20, name=laughing_meninsky, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default)
Dec 03 21:08:49 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'localpool'
Dec 03 21:08:49 compute-0 podman[81395]: 2025-12-03 21:08:49.476034274 +0000 UTC m=+0.133832654 container start b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14 (image=quay.io/ceph/ceph:v20, name=laughing_meninsky, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:49 compute-0 podman[81395]: 2025-12-03 21:08:49.479641743 +0000 UTC m=+0.137440133 container attach b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14 (image=quay.io/ceph/ceph:v20, name=laughing_meninsky, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:49 compute-0 laughing_meninsky[81411]: 167 167
Dec 03 21:08:49 compute-0 systemd[1]: libpod-b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14.scope: Deactivated successfully.
Dec 03 21:08:49 compute-0 podman[81395]: 2025-12-03 21:08:49.482396051 +0000 UTC m=+0.140194451 container died b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14 (image=quay.io/ceph/ceph:v20, name=laughing_meninsky, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-dcac2bb3d766ff155d011d0cee8b53a6be44d2f63d67d9c0b584d9d68efe6872-merged.mount: Deactivated successfully.
Dec 03 21:08:49 compute-0 podman[81395]: 2025-12-03 21:08:49.521247003 +0000 UTC m=+0.179045383 container remove b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14 (image=quay.io/ceph/ceph:v20, name=laughing_meninsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:49 compute-0 systemd[1]: libpod-conmon-b7f1ba6f374c2a01d5c1e125f274f6bb21a90173f4b4fc362acb68f1cc3f4b14.scope: Deactivated successfully.
Dec 03 21:08:49 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'mds_autoscaler'
Dec 03 21:08:49 compute-0 sudo[81342]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:49 compute-0 sudo[81427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:49 compute-0 sudo[81427]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:49 compute-0 sudo[81427]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:49 compute-0 sudo[81452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:08:49 compute-0 sudo[81452]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:49 compute-0 sudo[81500]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sixqqkfmrrqplihgkwhspxiobjjygboy ; /usr/bin/python3'
Dec 03 21:08:49 compute-0 sudo[81500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:49 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:49 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'mirroring'
Dec 03 21:08:49 compute-0 python3[81502]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch apply --in-file /home/ceph_spec.yaml _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:49 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'nfs'
Dec 03 21:08:50 compute-0 podman[81505]: 2025-12-03 21:08:49.953333189 +0000 UTC m=+0.044439221 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:50 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'orchestrator'
Dec 03 21:08:50 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'osd_perf_query'
Dec 03 21:08:50 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'osd_support'
Dec 03 21:08:50 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'pg_autoscaler'
Dec 03 21:08:50 compute-0 podman[81505]: 2025-12-03 21:08:50.647207467 +0000 UTC m=+0.738313439 container create 85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f (image=quay.io/ceph/ceph:v20, name=elated_moore, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:08:50 compute-0 ceph-mon[75204]: Reconfiguring mgr.compute-0.jxauqt (unknown last config time)...
Dec 03 21:08:50 compute-0 ceph-mon[75204]: Reconfiguring daemon mgr.compute-0.jxauqt on compute-0
Dec 03 21:08:50 compute-0 ceph-mon[75204]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:50 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'progress'
Dec 03 21:08:50 compute-0 systemd[1]: Started libpod-conmon-85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f.scope.
Dec 03 21:08:50 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a707decd1ffea673dc464bd073abf3aa4105b05d4ccb4c86b0a15eb5cd26607/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a707decd1ffea673dc464bd073abf3aa4105b05d4ccb4c86b0a15eb5cd26607/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a707decd1ffea673dc464bd073abf3aa4105b05d4ccb4c86b0a15eb5cd26607/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:50 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'prometheus'
Dec 03 21:08:50 compute-0 podman[81505]: 2025-12-03 21:08:50.743026289 +0000 UTC m=+0.834132251 container init 85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f (image=quay.io/ceph/ceph:v20, name=elated_moore, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:50 compute-0 podman[81505]: 2025-12-03 21:08:50.754450351 +0000 UTC m=+0.845556323 container start 85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f (image=quay.io/ceph/ceph:v20, name=elated_moore, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:08:50 compute-0 podman[81505]: 2025-12-03 21:08:50.758588484 +0000 UTC m=+0.849694426 container attach 85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f (image=quay.io/ceph/ceph:v20, name=elated_moore, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 03 21:08:50 compute-0 podman[81564]: 2025-12-03 21:08:50.865788378 +0000 UTC m=+0.063920534 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 03 21:08:50 compute-0 podman[81564]: 2025-12-03 21:08:50.982913778 +0000 UTC m=+0.181045884 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:51 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'rbd_support'
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14176 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:51 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'rgw'
Dec 03 21:08:51 compute-0 sudo[81660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:51 compute-0 sudo[81660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:51 compute-0 sudo[81660]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:51 compute-0 sudo[81703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 check-host --expect-hostname compute-0
Dec 03 21:08:51 compute-0 sudo[81703]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:51 compute-0 sudo[81452]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:08:51 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'rook'
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 sudo[81744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:08:51 compute-0 sudo[81744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:51 compute-0 sudo[81744]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:51 compute-0 sudo[81703]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e3 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [cephadm INFO root] Added host compute-0
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Added host compute-0
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service mon spec with placement compute-0
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service mon spec with placement compute-0
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service mgr spec with placement compute-0
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service mgr spec with placement compute-0
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [cephadm INFO root] Marking host: compute-0 for OSDSpec preview refresh.
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Marking host: compute-0 for OSDSpec preview refresh.
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service osd.default_drive_group spec with placement compute-0
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service osd.default_drive_group spec with placement compute-0
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.osd.default_drive_group}] v 0)
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 1713cf37-6938-4075-be59-f0a2a6a264e6 (Updating mgr deployment (-1 -> 1))
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Removing daemon mgr.compute-0.jdapcy from compute-0 -- ports [8765]
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Removing daemon mgr.compute-0.jdapcy from compute-0 -- ports [8765]
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 elated_moore[81546]: Added host 'compute-0' with addr '192.168.122.100'
Dec 03 21:08:51 compute-0 elated_moore[81546]: Scheduled mon update...
Dec 03 21:08:51 compute-0 elated_moore[81546]: Scheduled mgr update...
Dec 03 21:08:51 compute-0 elated_moore[81546]: Scheduled osd.default_drive_group update...
Dec 03 21:08:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [progress INFO root] Writing back 2 completed events
Dec 03 21:08:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:08:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:08:51 compute-0 systemd[1]: libpod-85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f.scope: Deactivated successfully.
Dec 03 21:08:51 compute-0 sudo[81789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:51 compute-0 sudo[81789]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:51 compute-0 sudo[81789]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:51 compute-0 podman[81803]: 2025-12-03 21:08:51.823299622 +0000 UTC m=+0.029686696 container died 85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f (image=quay.io/ceph/ceph:v20, name=elated_moore, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:08:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-0a707decd1ffea673dc464bd073abf3aa4105b05d4ccb4c86b0a15eb5cd26607-merged.mount: Deactivated successfully.
Dec 03 21:08:51 compute-0 podman[81803]: 2025-12-03 21:08:51.871367822 +0000 UTC m=+0.077754896 container remove 85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f (image=quay.io/ceph/ceph:v20, name=elated_moore, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:08:51 compute-0 systemd[1]: libpod-conmon-85e2bc82a5f18c21d7834cbd6f0c0d42e81b0fb249a6ba1fd384fe452906c87f.scope: Deactivated successfully.
Dec 03 21:08:51 compute-0 sudo[81824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 rm-daemon --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --name mgr.compute-0.jdapcy --force --tcp-ports 8765
Dec 03 21:08:51 compute-0 sudo[81824]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:51 compute-0 sudo[81500]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:52 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'selftest'
Dec 03 21:08:52 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'smb'
Dec 03 21:08:52 compute-0 sudo[81890]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmfvijhpxfbvulmpndmrkaiuntvgclot ; /usr/bin/python3'
Dec 03 21:08:52 compute-0 sudo[81890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:08:52 compute-0 systemd[1]: Stopping Ceph mgr.compute-0.jdapcy for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:08:52 compute-0 python3[81895]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:08:52 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'snap_schedule'
Dec 03 21:08:52 compute-0 podman[81918]: 2025-12-03 21:08:52.446778476 +0000 UTC m=+0.081896138 container create 0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574 (image=quay.io/ceph/ceph:v20, name=pensive_thompson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:52 compute-0 ceph-mon[75204]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:52 compute-0 ceph-mgr[80827]: mgr[py] Loading python module 'stats'
Dec 03 21:08:52 compute-0 podman[81918]: 2025-12-03 21:08:52.411911603 +0000 UTC m=+0.047029305 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:08:52 compute-0 systemd[1]: Started libpod-conmon-0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574.scope.
Dec 03 21:08:52 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a81441c14d6d1f3a925ead8e2ee0305dddff9ca4b57890b30177026cf94ad6f/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a81441c14d6d1f3a925ead8e2ee0305dddff9ca4b57890b30177026cf94ad6f/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a81441c14d6d1f3a925ead8e2ee0305dddff9ca4b57890b30177026cf94ad6f/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:52 compute-0 podman[81918]: 2025-12-03 21:08:52.570444597 +0000 UTC m=+0.205562239 container init 0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574 (image=quay.io/ceph/ceph:v20, name=pensive_thompson, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:08:52 compute-0 podman[81939]: 2025-12-03 21:08:52.578886776 +0000 UTC m=+0.146759024 container died 70b4c07d62442eeb897b819626352cfaa1be138f8695a20ab1ca8ed143214483 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:08:52 compute-0 podman[81918]: 2025-12-03 21:08:52.58222608 +0000 UTC m=+0.217343742 container start 0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574 (image=quay.io/ceph/ceph:v20, name=pensive_thompson, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:08:52 compute-0 podman[81918]: 2025-12-03 21:08:52.594840941 +0000 UTC m=+0.229958563 container attach 0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574 (image=quay.io/ceph/ceph:v20, name=pensive_thompson, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:08:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-4fed3044219a64be64de93285b6cd8db400e3b0cdcae1095ce876c7b38efed9d-merged.mount: Deactivated successfully.
Dec 03 21:08:52 compute-0 podman[81939]: 2025-12-03 21:08:52.636490172 +0000 UTC m=+0.204362420 container remove 70b4c07d62442eeb897b819626352cfaa1be138f8695a20ab1ca8ed143214483 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default)
Dec 03 21:08:52 compute-0 bash[81939]: ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jdapcy
Dec 03 21:08:52 compute-0 systemd[1]: ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mgr.compute-0.jdapcy.service: Main process exited, code=exited, status=143/n/a
Dec 03 21:08:52 compute-0 systemd[1]: ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mgr.compute-0.jdapcy.service: Failed with result 'exit-code'.
Dec 03 21:08:52 compute-0 systemd[1]: Stopped Ceph mgr.compute-0.jdapcy for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:08:52 compute-0 systemd[1]: ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mgr.compute-0.jdapcy.service: Consumed 6.880s CPU time, 384.9M memory peak, read 0B from disk, written 216.0K to disk.
Dec 03 21:08:52 compute-0 systemd[1]: Reloading.
Dec 03 21:08:52 compute-0 systemd-rc-local-generator[82047]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:08:52 compute-0 systemd-sysv-generator[82050]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:08:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 03 21:08:53 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2135877147' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:08:53 compute-0 pensive_thompson[81956]: 
Dec 03 21:08:53 compute-0 pensive_thompson[81956]: {"fsid":"c21de27e-a7fd-594b-8324-0697ba9aab3a","health":{"status":"HEALTH_WARN","checks":{"TOO_FEW_OSDS":{"severity":"HEALTH_WARN","summary":{"message":"OSD count 0 < osd_pool_default_size 1","count":1},"muted":false}},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":51,"monmap":{"epoch":1,"min_mon_release_name":"tentacle","num_mons":1},"osdmap":{"epoch":3,"num_osds":0,"num_up_osds":0,"osd_up_since":0,"num_in_osds":0,"osd_in_since":0,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[],"num_pgs":0,"num_pools":0,"num_objects":0,"data_bytes":0,"bytes_used":0,"bytes_avail":0,"bytes_total":0},"fsmap":{"epoch":1,"btime":"2025-12-03T21:07:59:373870+0000","by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs"],"services":{}},"servicemap":{"epoch":1,"modified":"2025-12-03T21:07:59.377140+0000","services":{}},"progress_events":{}}
Dec 03 21:08:53 compute-0 systemd[1]: libpod-0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574.scope: Deactivated successfully.
Dec 03 21:08:53 compute-0 podman[81918]: 2025-12-03 21:08:53.094458529 +0000 UTC m=+0.729576191 container died 0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574 (image=quay.io/ceph/ceph:v20, name=pensive_thompson, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:08:53 compute-0 sudo[81824]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a81441c14d6d1f3a925ead8e2ee0305dddff9ca4b57890b30177026cf94ad6f-merged.mount: Deactivated successfully.
Dec 03 21:08:53 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.compute-0.jdapcy
Dec 03 21:08:53 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Removing key for mgr.compute-0.jdapcy
Dec 03 21:08:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.compute-0.jdapcy"} v 0)
Dec 03 21:08:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth rm", "entity": "mgr.compute-0.jdapcy"} : dispatch
Dec 03 21:08:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth rm", "entity": "mgr.compute-0.jdapcy"}]': finished
Dec 03 21:08:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Dec 03 21:08:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:53 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 1713cf37-6938-4075-be59-f0a2a6a264e6 (Updating mgr deployment (-1 -> 1))
Dec 03 21:08:53 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 1713cf37-6938-4075-be59-f0a2a6a264e6 (Updating mgr deployment (-1 -> 1)) in 1 seconds
Dec 03 21:08:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Dec 03 21:08:53 compute-0 podman[81918]: 2025-12-03 21:08:53.158149736 +0000 UTC m=+0.793267358 container remove 0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574 (image=quay.io/ceph/ceph:v20, name=pensive_thompson, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:08:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:53 compute-0 systemd[1]: libpod-conmon-0c2ab80786eb651df95ad062510362b289311eca979d4306cf2fa024bbb1d574.scope: Deactivated successfully.
Dec 03 21:08:53 compute-0 sudo[81890]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:53 compute-0 sudo[82073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:08:53 compute-0 sudo[82073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:53 compute-0 sudo[82073]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:53 compute-0 sudo[82098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:53 compute-0 sudo[82098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:53 compute-0 sudo[82098]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:53 compute-0 sudo[82123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:08:53 compute-0 sudo[82123]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:53 compute-0 ceph-mon[75204]: Added host compute-0
Dec 03 21:08:53 compute-0 ceph-mon[75204]: Saving service mon spec with placement compute-0
Dec 03 21:08:53 compute-0 ceph-mon[75204]: Saving service mgr spec with placement compute-0
Dec 03 21:08:53 compute-0 ceph-mon[75204]: Marking host: compute-0 for OSDSpec preview refresh.
Dec 03 21:08:53 compute-0 ceph-mon[75204]: Saving service osd.default_drive_group spec with placement compute-0
Dec 03 21:08:53 compute-0 ceph-mon[75204]: Removing daemon mgr.compute-0.jdapcy from compute-0 -- ports [8765]
Dec 03 21:08:53 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2135877147' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:08:53 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth rm", "entity": "mgr.compute-0.jdapcy"} : dispatch
Dec 03 21:08:53 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth rm", "entity": "mgr.compute-0.jdapcy"}]': finished
Dec 03 21:08:53 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:53 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:53 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:53 compute-0 podman[82192]: 2025-12-03 21:08:53.86317252 +0000 UTC m=+0.087104958 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:08:53 compute-0 podman[82192]: 2025-12-03 21:08:53.955011393 +0000 UTC m=+0.178943801 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:08:54 compute-0 sudo[82123]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:08:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:54 compute-0 ceph-mon[75204]: Removing key for mgr.compute-0.jdapcy
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:08:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:08:54 compute-0 sudo[82289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:08:54 compute-0 sudo[82289]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:54 compute-0 sudo[82289]: pam_unix(sudo:session): session closed for user root
Dec 03 21:08:54 compute-0 sudo[82314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:08:54 compute-0 sudo[82314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:08:54 compute-0 podman[82351]: 2025-12-03 21:08:54.913794588 +0000 UTC m=+0.059076773 container create 5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_chaplygin, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, ceph=True)
Dec 03 21:08:54 compute-0 systemd[1]: Started libpod-conmon-5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6.scope.
Dec 03 21:08:54 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:54 compute-0 podman[82351]: 2025-12-03 21:08:54.892004578 +0000 UTC m=+0.037286773 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:54 compute-0 podman[82351]: 2025-12-03 21:08:54.990289102 +0000 UTC m=+0.135571327 container init 5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_chaplygin, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:54 compute-0 podman[82351]: 2025-12-03 21:08:54.999718565 +0000 UTC m=+0.145000750 container start 5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_chaplygin, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:08:55 compute-0 podman[82351]: 2025-12-03 21:08:55.004078953 +0000 UTC m=+0.149361198 container attach 5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_chaplygin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 03 21:08:55 compute-0 affectionate_chaplygin[82367]: 167 167
Dec 03 21:08:55 compute-0 systemd[1]: libpod-5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6.scope: Deactivated successfully.
Dec 03 21:08:55 compute-0 podman[82351]: 2025-12-03 21:08:55.006065232 +0000 UTC m=+0.151347418 container died 5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_chaplygin, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:08:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-e6788c731b1be06689ac1472f19d689b11bf8dcc76c22a95b8dfe354d10a55e0-merged.mount: Deactivated successfully.
Dec 03 21:08:55 compute-0 podman[82351]: 2025-12-03 21:08:55.057346392 +0000 UTC m=+0.202628547 container remove 5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_chaplygin, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:08:55 compute-0 systemd[1]: libpod-conmon-5ef7e690cffb46db45e18ef861bca85ff01b1193ddae4eee257f63fc6f1ed8b6.scope: Deactivated successfully.
Dec 03 21:08:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:55 compute-0 podman[82390]: 2025-12-03 21:08:55.217060415 +0000 UTC m=+0.042363229 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:08:55 compute-0 podman[82390]: 2025-12-03 21:08:55.32225473 +0000 UTC m=+0.147557464 container create 35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_mestorf, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:55 compute-0 systemd[1]: Started libpod-conmon-35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df.scope.
Dec 03 21:08:55 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:08:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24f6de2fc2d08bcf56932009a440743ea8001b2a3fbd9773414d25e9b8a2e3bc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24f6de2fc2d08bcf56932009a440743ea8001b2a3fbd9773414d25e9b8a2e3bc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24f6de2fc2d08bcf56932009a440743ea8001b2a3fbd9773414d25e9b8a2e3bc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24f6de2fc2d08bcf56932009a440743ea8001b2a3fbd9773414d25e9b8a2e3bc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24f6de2fc2d08bcf56932009a440743ea8001b2a3fbd9773414d25e9b8a2e3bc/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:08:55 compute-0 podman[82390]: 2025-12-03 21:08:55.464010079 +0000 UTC m=+0.289312833 container init 35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_mestorf, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:08:55 compute-0 podman[82390]: 2025-12-03 21:08:55.477925673 +0000 UTC m=+0.303228437 container start 35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_mestorf, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:08:55 compute-0 podman[82390]: 2025-12-03 21:08:55.482013795 +0000 UTC m=+0.307316549 container attach 35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_mestorf, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:08:55 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 4d33bc95-baf8-481d-bc78-3b15ffd29872
Dec 03 21:08:56 compute-0 ceph-mon[75204]: pgmap v11: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e3 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:08:56 compute-0 ceph-mgr[75500]: [progress INFO root] Writing back 3 completed events
Dec 03 21:08:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 03 21:08:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "4d33bc95-baf8-481d-bc78-3b15ffd29872"} v 0)
Dec 03 21:08:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1335551350' entity='client.bootstrap-osd' cmd={"prefix": "osd new", "uuid": "4d33bc95-baf8-481d-bc78-3b15ffd29872"} : dispatch
Dec 03 21:08:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e3 do_prune osdmap full prune enabled
Dec 03 21:08:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e3 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:08:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/1335551350' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4d33bc95-baf8-481d-bc78-3b15ffd29872"}]': finished
Dec 03 21:08:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e4 e4: 1 total, 0 up, 1 in
Dec 03 21:08:56 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e4: 1 total, 0 up, 1 in
Dec 03 21:08:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:08:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:08:56 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 03 21:08:56 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Dec 03 21:08:57 compute-0 lvm[82500]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:08:57 compute-0 lvm[82500]: VG ceph_vg0 finished
Dec 03 21:08:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0)
Dec 03 21:08:57 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3432163934' entity='client.bootstrap-osd' cmd={"prefix": "mon getmap"} : dispatch
Dec 03 21:08:57 compute-0 sleepy_mestorf[82406]:  stderr: got monmap epoch 1
Dec 03 21:08:57 compute-0 sleepy_mestorf[82406]: --> Creating keyring file for osd.0
Dec 03 21:08:57 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Dec 03 21:08:57 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Dec 03 21:08:57 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 4d33bc95-baf8-481d-bc78-3b15ffd29872 --setuser ceph --setgroup ceph
Dec 03 21:08:57 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:57 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:08:57 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1335551350' entity='client.bootstrap-osd' cmd={"prefix": "osd new", "uuid": "4d33bc95-baf8-481d-bc78-3b15ffd29872"} : dispatch
Dec 03 21:08:57 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1335551350' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4d33bc95-baf8-481d-bc78-3b15ffd29872"}]': finished
Dec 03 21:08:57 compute-0 ceph-mon[75204]: osdmap e4: 1 total, 0 up, 1 in
Dec 03 21:08:57 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:08:57 compute-0 ceph-mon[75204]: pgmap v13: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:57 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3432163934' entity='client.bootstrap-osd' cmd={"prefix": "mon getmap"} : dispatch
Dec 03 21:08:57 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Dec 03 21:08:57 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]:  stderr: 2025-12-03T21:08:57.641+0000 7f1ad02228c0 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]:  stderr: 2025-12-03T21:08:57.661+0000 7f1ad02228c0 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm activate successful for osd ID: 0
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:08:58 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new c4086f1b-ff53-4e63-8dc0-011238d77976
Dec 03 21:08:58 compute-0 ceph-mon[75204]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Dec 03 21:08:58 compute-0 ceph-mon[75204]: Cluster is now healthy
Dec 03 21:08:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "c4086f1b-ff53-4e63-8dc0-011238d77976"} v 0)
Dec 03 21:08:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/69530890' entity='client.bootstrap-osd' cmd={"prefix": "osd new", "uuid": "c4086f1b-ff53-4e63-8dc0-011238d77976"} : dispatch
Dec 03 21:08:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e4 do_prune osdmap full prune enabled
Dec 03 21:08:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e4 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:08:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/69530890' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c4086f1b-ff53-4e63-8dc0-011238d77976"}]': finished
Dec 03 21:08:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e5 e5: 2 total, 0 up, 2 in
Dec 03 21:08:59 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e5: 2 total, 0 up, 2 in
Dec 03 21:08:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:08:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:08:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:08:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:08:59 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:08:59 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:08:59 compute-0 lvm[83449]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:08:59 compute-0 lvm[83449]: VG ceph_vg1 finished
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-1
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-1/activate.monmap
Dec 03 21:08:59 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:08:59 compute-0 ceph-mon[75204]: pgmap v14: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:08:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/69530890' entity='client.bootstrap-osd' cmd={"prefix": "osd new", "uuid": "c4086f1b-ff53-4e63-8dc0-011238d77976"} : dispatch
Dec 03 21:08:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/69530890' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c4086f1b-ff53-4e63-8dc0-011238d77976"}]': finished
Dec 03 21:08:59 compute-0 ceph-mon[75204]: osdmap e5: 2 total, 0 up, 2 in
Dec 03 21:08:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:08:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:08:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0)
Dec 03 21:08:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/112235096' entity='client.bootstrap-osd' cmd={"prefix": "mon getmap"} : dispatch
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]:  stderr: got monmap epoch 1
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: --> Creating keyring file for osd.1
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/keyring
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1/
Dec 03 21:08:59 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 1 --monmap /var/lib/ceph/osd/ceph-1/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-1/ --osd-uuid c4086f1b-ff53-4e63-8dc0-011238d77976 --setuser ceph --setgroup ceph
Dec 03 21:09:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/112235096' entity='client.bootstrap-osd' cmd={"prefix": "mon getmap"} : dispatch
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]:  stderr: 2025-12-03T21:08:59.933+0000 7f32fb9af8c0 -1 bluestore(/var/lib/ceph/osd/ceph-1//block) No valid bdev label found
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]:  stderr: 2025-12-03T21:08:59.958+0000 7f32fb9af8c0 -1 bluestore(/var/lib/ceph/osd/ceph-1/) _read_fsid unparsable uuid
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm activate successful for osd ID: 1
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Dec 03 21:09:00 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:01 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:01 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new abcd6a67-9013-4470-978f-f75da5f33cd4
Dec 03 21:09:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v16: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd new", "uuid": "abcd6a67-9013-4470-978f-f75da5f33cd4"} v 0)
Dec 03 21:09:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4215295916' entity='client.bootstrap-osd' cmd={"prefix": "osd new", "uuid": "abcd6a67-9013-4470-978f-f75da5f33cd4"} : dispatch
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e5 do_prune osdmap full prune enabled
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e5 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:09:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/4215295916' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "abcd6a67-9013-4470-978f-f75da5f33cd4"}]': finished
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e6 e6: 3 total, 0 up, 3 in
Dec 03 21:09:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e6: 3 total, 0 up, 3 in
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:01 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:09:01 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:01 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:01 compute-0 lvm[84398]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:09:01 compute-0 lvm[84398]: VG ceph_vg2 finished
Dec 03 21:09:01 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Dec 03 21:09:01 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg2/ceph_lv2
Dec 03 21:09:01 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 03 21:09:01 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ln -s /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:01 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Dec 03 21:09:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:01 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:09:01 compute-0 ceph-mon[75204]: pgmap v16: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:01 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4215295916' entity='client.bootstrap-osd' cmd={"prefix": "osd new", "uuid": "abcd6a67-9013-4470-978f-f75da5f33cd4"} : dispatch
Dec 03 21:09:01 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4215295916' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "abcd6a67-9013-4470-978f-f75da5f33cd4"}]': finished
Dec 03 21:09:01 compute-0 ceph-mon[75204]: osdmap e6: 3 total, 0 up, 3 in
Dec 03 21:09:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon getmap"} v 0)
Dec 03 21:09:02 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1536442231' entity='client.bootstrap-osd' cmd={"prefix": "mon getmap"} : dispatch
Dec 03 21:09:02 compute-0 sleepy_mestorf[82406]:  stderr: got monmap epoch 1
Dec 03 21:09:02 compute-0 sleepy_mestorf[82406]: --> Creating keyring file for osd.2
Dec 03 21:09:02 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Dec 03 21:09:02 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Dec 03 21:09:02 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid abcd6a67-9013-4470-978f-f75da5f33cd4 --setuser ceph --setgroup ceph
Dec 03 21:09:02 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1536442231' entity='client.bootstrap-osd' cmd={"prefix": "mon getmap"} : dispatch
Dec 03 21:09:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v18: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]:  stderr: 2025-12-03T21:09:02.309+0000 7f4160f628c0 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]:  stderr: 2025-12-03T21:09:02.325+0000 7f4160f628c0 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm prepare successful for: ceph_vg2/ceph_lv2
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg2/ceph_lv2 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/ln -snf /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 03 21:09:03 compute-0 sleepy_mestorf[82406]: --> ceph-volume lvm create successful for: ceph_vg2/ceph_lv2
Dec 03 21:09:03 compute-0 systemd[1]: libpod-35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df.scope: Deactivated successfully.
Dec 03 21:09:03 compute-0 systemd[1]: libpod-35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df.scope: Consumed 6.485s CPU time.
Dec 03 21:09:03 compute-0 podman[85314]: 2025-12-03 21:09:03.39882178 +0000 UTC m=+0.035288055 container died 35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_mestorf, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:09:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-24f6de2fc2d08bcf56932009a440743ea8001b2a3fbd9773414d25e9b8a2e3bc-merged.mount: Deactivated successfully.
Dec 03 21:09:03 compute-0 podman[85314]: 2025-12-03 21:09:03.445650058 +0000 UTC m=+0.082116283 container remove 35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_mestorf, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 03 21:09:03 compute-0 systemd[1]: libpod-conmon-35dda402f047d2f7d6f80c4624a429643b78d5c45fd19618790fd66ea3c7f2df.scope: Deactivated successfully.
Dec 03 21:09:03 compute-0 sudo[82314]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:03 compute-0 sudo[85329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:03 compute-0 sudo[85329]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:03 compute-0 sudo[85329]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:03 compute-0 sudo[85354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:09:03 compute-0 sudo[85354]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:03 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:09:03 compute-0 ceph-mon[75204]: pgmap v18: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:03 compute-0 podman[85391]: 2025-12-03 21:09:03.994322121 +0000 UTC m=+0.044006360 container create 67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_ride, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:04 compute-0 systemd[1]: Started libpod-conmon-67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609.scope.
Dec 03 21:09:04 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:04 compute-0 podman[85391]: 2025-12-03 21:09:03.977260239 +0000 UTC m=+0.026944498 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:04 compute-0 podman[85391]: 2025-12-03 21:09:04.074056535 +0000 UTC m=+0.123740835 container init 67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_ride, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS)
Dec 03 21:09:04 compute-0 podman[85391]: 2025-12-03 21:09:04.090156843 +0000 UTC m=+0.139841092 container start 67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_ride, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:09:04 compute-0 podman[85391]: 2025-12-03 21:09:04.095026484 +0000 UTC m=+0.144711413 container attach 67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_ride, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:04 compute-0 goofy_ride[85407]: 167 167
Dec 03 21:09:04 compute-0 systemd[1]: libpod-67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609.scope: Deactivated successfully.
Dec 03 21:09:04 compute-0 podman[85391]: 2025-12-03 21:09:04.097616978 +0000 UTC m=+0.147301267 container died 67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_ride, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-b9a0e8c9dba25b3b4f70493f4e0a44564ba5a88f441f0ac80a354991da530240-merged.mount: Deactivated successfully.
Dec 03 21:09:04 compute-0 podman[85391]: 2025-12-03 21:09:04.140183182 +0000 UTC m=+0.189867441 container remove 67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_ride, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:09:04 compute-0 systemd[1]: libpod-conmon-67a4bf83ed5b553d4b0b43f0835e9895d2ab8b7ae068491d23cedae5b4024609.scope: Deactivated successfully.
Dec 03 21:09:04 compute-0 podman[85431]: 2025-12-03 21:09:04.374182355 +0000 UTC m=+0.069132163 container create 804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_rhodes, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:09:04 compute-0 systemd[1]: Started libpod-conmon-804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca.scope.
Dec 03 21:09:04 compute-0 podman[85431]: 2025-12-03 21:09:04.347441993 +0000 UTC m=+0.042391871 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:04 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f01e3c07af433cd59fbeb9369bff6d1c06b08f15d26e30047fc3c3da596f959/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f01e3c07af433cd59fbeb9369bff6d1c06b08f15d26e30047fc3c3da596f959/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f01e3c07af433cd59fbeb9369bff6d1c06b08f15d26e30047fc3c3da596f959/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f01e3c07af433cd59fbeb9369bff6d1c06b08f15d26e30047fc3c3da596f959/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:04 compute-0 podman[85431]: 2025-12-03 21:09:04.472757365 +0000 UTC m=+0.167707253 container init 804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_rhodes, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 03 21:09:04 compute-0 podman[85431]: 2025-12-03 21:09:04.487268434 +0000 UTC m=+0.182218262 container start 804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_rhodes, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 03 21:09:04 compute-0 podman[85431]: 2025-12-03 21:09:04.491706425 +0000 UTC m=+0.186656283 container attach 804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_rhodes, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:09:04 compute-0 objective_rhodes[85448]: {
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:     "0": [
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:         {
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "devices": [
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "/dev/loop3"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             ],
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_name": "ceph_lv0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_size": "21470642176",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "name": "ceph_lv0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "tags": {
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.crush_device_class": "",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.encrypted": "0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osd_id": "0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.type": "block",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.vdo": "0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.with_tpm": "0"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             },
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "type": "block",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "vg_name": "ceph_vg0"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:         }
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:     ],
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:     "1": [
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:         {
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "devices": [
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "/dev/loop4"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             ],
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_name": "ceph_lv1",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_size": "21470642176",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "name": "ceph_lv1",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "tags": {
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.crush_device_class": "",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.encrypted": "0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osd_id": "1",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.type": "block",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.vdo": "0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.with_tpm": "0"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             },
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "type": "block",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "vg_name": "ceph_vg1"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:         }
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:     ],
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:     "2": [
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:         {
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "devices": [
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "/dev/loop5"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             ],
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_name": "ceph_lv2",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_size": "21470642176",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "name": "ceph_lv2",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "tags": {
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.crush_device_class": "",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.encrypted": "0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osd_id": "2",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.type": "block",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.vdo": "0",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:                 "ceph.with_tpm": "0"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             },
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "type": "block",
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:             "vg_name": "ceph_vg2"
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:         }
Dec 03 21:09:04 compute-0 objective_rhodes[85448]:     ]
Dec 03 21:09:04 compute-0 objective_rhodes[85448]: }
Dec 03 21:09:04 compute-0 systemd[1]: libpod-804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca.scope: Deactivated successfully.
Dec 03 21:09:04 compute-0 podman[85431]: 2025-12-03 21:09:04.841150455 +0000 UTC m=+0.536100283 container died 804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_rhodes, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030)
Dec 03 21:09:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-0f01e3c07af433cd59fbeb9369bff6d1c06b08f15d26e30047fc3c3da596f959-merged.mount: Deactivated successfully.
Dec 03 21:09:04 compute-0 podman[85431]: 2025-12-03 21:09:04.892848225 +0000 UTC m=+0.587798023 container remove 804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_rhodes, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:04 compute-0 systemd[1]: libpod-conmon-804c274abe538c25d709c5d28deafbfe8b9559f094847d2c6f590bf62c7213ca.scope: Deactivated successfully.
Dec 03 21:09:04 compute-0 sudo[85354]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Dec 03 21:09:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 03 21:09:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:04 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:04 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Deploying daemon osd.0 on compute-0
Dec 03 21:09:04 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Deploying daemon osd.0 on compute-0
Dec 03 21:09:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 03 21:09:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:05 compute-0 sudo[85469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:05 compute-0 sudo[85469]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:05 compute-0 sudo[85469]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:05 compute-0 sudo[85494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:09:05 compute-0 sudo[85494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v19: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:05 compute-0 podman[85561]: 2025-12-03 21:09:05.671982744 +0000 UTC m=+0.059980796 container create b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_napier, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:09:05 compute-0 systemd[1]: Started libpod-conmon-b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0.scope.
Dec 03 21:09:05 compute-0 podman[85561]: 2025-12-03 21:09:05.656252364 +0000 UTC m=+0.044250436 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:05 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:09:05 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:05 compute-0 podman[85561]: 2025-12-03 21:09:05.786025045 +0000 UTC m=+0.174023207 container init b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_napier, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:05 compute-0 podman[85561]: 2025-12-03 21:09:05.798222534 +0000 UTC m=+0.186220596 container start b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_napier, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:05 compute-0 podman[85561]: 2025-12-03 21:09:05.801943501 +0000 UTC m=+0.189941653 container attach b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_napier, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:05 compute-0 tender_napier[85578]: 167 167
Dec 03 21:09:05 compute-0 systemd[1]: libpod-b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0.scope: Deactivated successfully.
Dec 03 21:09:05 compute-0 podman[85561]: 2025-12-03 21:09:05.80683761 +0000 UTC m=+0.194835702 container died b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_napier, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:09:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-f4b6855a5657655f53d145eb6d8837f7c961041453902f23752303535f4d5633-merged.mount: Deactivated successfully.
Dec 03 21:09:05 compute-0 podman[85561]: 2025-12-03 21:09:05.86554161 +0000 UTC m=+0.253539702 container remove b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_napier, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030)
Dec 03 21:09:05 compute-0 systemd[1]: libpod-conmon-b3bacbb6bd5d2ac90c3ddef9f3b5d47c83f4db7dfc94823a582db0276a2528c0.scope: Deactivated successfully.
Dec 03 21:09:06 compute-0 ceph-mon[75204]: Deploying daemon osd.0 on compute-0
Dec 03 21:09:06 compute-0 ceph-mon[75204]: pgmap v19: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:06 compute-0 podman[85609]: 2025-12-03 21:09:06.172149546 +0000 UTC m=+0.065048161 container create 01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:09:06 compute-0 systemd[1]: Started libpod-conmon-01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e.scope.
Dec 03 21:09:06 compute-0 podman[85609]: 2025-12-03 21:09:06.146258806 +0000 UTC m=+0.039157461 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:06 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/732a454788bcef6a58ffa0718cbb7e894341dae6b3d25eba3a92d6c67062602e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/732a454788bcef6a58ffa0718cbb7e894341dae6b3d25eba3a92d6c67062602e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/732a454788bcef6a58ffa0718cbb7e894341dae6b3d25eba3a92d6c67062602e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/732a454788bcef6a58ffa0718cbb7e894341dae6b3d25eba3a92d6c67062602e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/732a454788bcef6a58ffa0718cbb7e894341dae6b3d25eba3a92d6c67062602e/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:06 compute-0 podman[85609]: 2025-12-03 21:09:06.297175881 +0000 UTC m=+0.190074556 container init 01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:09:06 compute-0 podman[85609]: 2025-12-03 21:09:06.303699335 +0000 UTC m=+0.196597950 container start 01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:09:06 compute-0 podman[85609]: 2025-12-03 21:09:06.307474721 +0000 UTC m=+0.200373416 container attach 01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:06 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test[85625]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec 03 21:09:06 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test[85625]:                             [--no-systemd] [--no-tmpfs]
Dec 03 21:09:06 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test[85625]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 03 21:09:06 compute-0 systemd[1]: libpod-01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e.scope: Deactivated successfully.
Dec 03 21:09:06 compute-0 podman[85609]: 2025-12-03 21:09:06.497813541 +0000 UTC m=+0.390712186 container died 01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:09:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-732a454788bcef6a58ffa0718cbb7e894341dae6b3d25eba3a92d6c67062602e-merged.mount: Deactivated successfully.
Dec 03 21:09:06 compute-0 podman[85609]: 2025-12-03 21:09:06.546265121 +0000 UTC m=+0.439163746 container remove 01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate-test, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:06 compute-0 systemd[1]: libpod-conmon-01df66932a42f22efa983977d6615177436d823b61a3cd92165e71abd955354e.scope: Deactivated successfully.
Dec 03 21:09:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e6 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:06 compute-0 systemd[1]: Reloading.
Dec 03 21:09:06 compute-0 systemd-sysv-generator[85687]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:06 compute-0 systemd-rc-local-generator[85683]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v20: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:07 compute-0 systemd[1]: Reloading.
Dec 03 21:09:07 compute-0 systemd-rc-local-generator[85726]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:07 compute-0 systemd-sysv-generator[85730]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:07 compute-0 systemd[1]: Starting Ceph osd.0 for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:09:07 compute-0 podman[85783]: 2025-12-03 21:09:07.676880957 +0000 UTC m=+0.051613506 container create 7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:07 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93aeabd5fb2036a9a60e6d6fc8084e7ed274504724adef94b91701868f95184d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93aeabd5fb2036a9a60e6d6fc8084e7ed274504724adef94b91701868f95184d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93aeabd5fb2036a9a60e6d6fc8084e7ed274504724adef94b91701868f95184d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93aeabd5fb2036a9a60e6d6fc8084e7ed274504724adef94b91701868f95184d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93aeabd5fb2036a9a60e6d6fc8084e7ed274504724adef94b91701868f95184d/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:07 compute-0 podman[85783]: 2025-12-03 21:09:07.657506251 +0000 UTC m=+0.032238830 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:07 compute-0 podman[85783]: 2025-12-03 21:09:07.754267068 +0000 UTC m=+0.128999657 container init 7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:07 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:09:07 compute-0 podman[85783]: 2025-12-03 21:09:07.767876476 +0000 UTC m=+0.142609055 container start 7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:09:07 compute-0 podman[85783]: 2025-12-03 21:09:07.772523731 +0000 UTC m=+0.147256300 container attach 7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:09:07 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:07 compute-0 bash[85783]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:07 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:07 compute-0 bash[85783]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:08 compute-0 ceph-mon[75204]: pgmap v20: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:08 compute-0 lvm[85886]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:09:08 compute-0 lvm[85886]: VG ceph_vg2 finished
Dec 03 21:09:08 compute-0 lvm[85887]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:09:08 compute-0 lvm[85887]: VG ceph_vg1 finished
Dec 03 21:09:08 compute-0 lvm[85883]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:09:08 compute-0 lvm[85883]: VG ceph_vg0 finished
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:08 compute-0 bash[85783]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 03 21:09:08 compute-0 bash[85783]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 03 21:09:08 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate[85799]: --> ceph-volume lvm activate successful for osd ID: 0
Dec 03 21:09:08 compute-0 bash[85783]: --> ceph-volume lvm activate successful for osd ID: 0
Dec 03 21:09:08 compute-0 systemd[1]: libpod-7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e.scope: Deactivated successfully.
Dec 03 21:09:08 compute-0 podman[85783]: 2025-12-03 21:09:08.950761031 +0000 UTC m=+1.325493590 container died 7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 03 21:09:08 compute-0 systemd[1]: libpod-7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e.scope: Consumed 1.709s CPU time.
Dec 03 21:09:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-93aeabd5fb2036a9a60e6d6fc8084e7ed274504724adef94b91701868f95184d-merged.mount: Deactivated successfully.
Dec 03 21:09:09 compute-0 podman[85783]: 2025-12-03 21:09:09.01193512 +0000 UTC m=+1.386667669 container remove 7ab8241e6960f866d0752598f0481dfff0a40fba60997b82e36328d4e487e60e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0-activate, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v21: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:09 compute-0 podman[86040]: 2025-12-03 21:09:09.289110974 +0000 UTC m=+0.052800480 container create fbaf3a19f1641818d960e065ac83d4b982a8620013c915c30fea979d7a9b5f7d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94de65b3a1b4cc1c3ae2fb2f5a4f4f678207e02cf957939c05c5f4000d0b6c8e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94de65b3a1b4cc1c3ae2fb2f5a4f4f678207e02cf957939c05c5f4000d0b6c8e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94de65b3a1b4cc1c3ae2fb2f5a4f4f678207e02cf957939c05c5f4000d0b6c8e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94de65b3a1b4cc1c3ae2fb2f5a4f4f678207e02cf957939c05c5f4000d0b6c8e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94de65b3a1b4cc1c3ae2fb2f5a4f4f678207e02cf957939c05c5f4000d0b6c8e/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:09 compute-0 podman[86040]: 2025-12-03 21:09:09.333543593 +0000 UTC m=+0.097233099 container init fbaf3a19f1641818d960e065ac83d4b982a8620013c915c30fea979d7a9b5f7d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:09 compute-0 podman[86040]: 2025-12-03 21:09:09.342368753 +0000 UTC m=+0.106058259 container start fbaf3a19f1641818d960e065ac83d4b982a8620013c915c30fea979d7a9b5f7d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle)
Dec 03 21:09:09 compute-0 bash[86040]: fbaf3a19f1641818d960e065ac83d4b982a8620013c915c30fea979d7a9b5f7d
Dec 03 21:09:09 compute-0 podman[86040]: 2025-12-03 21:09:09.263523442 +0000 UTC m=+0.027213038 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:09 compute-0 systemd[1]: Started Ceph osd.0 for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:09:09 compute-0 ceph-osd[86059]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-osd, pid 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: pidfile_write: ignore empty --pid-file
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 sudo[85494]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Dec 03 21:09:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:09 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Deploying daemon osd.1 on compute-0
Dec 03 21:09:09 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Deploying daemon osd.1 on compute-0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 sudo[86073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:09 compute-0 sudo[86073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:09 compute-0 sudo[86073]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 sudo[86102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:09:09 compute-0 sudo[86102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4400 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4400 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4400 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4400 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4400 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da4000 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Dec 03 21:09:09 compute-0 ceph-osd[86059]: load: jerasure load: lrc 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561448da5c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount shared_bdev_used = 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Git sha 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: DB SUMMARY
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: DB Session ID:  STVQO16ELC5LNUOQD2NX
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                     Options.env: 0x561448c35ea0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                Options.info_log: 0x561449c868a0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                 Options.wal_dir: db.wal
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.write_buffer_manager: 0x561448c9ab40
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.row_cache: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                              Options.wal_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.wal_compression: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_background_jobs: 4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Compression algorithms supported:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kZSTD supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c39a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c39a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c39a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d2169ef4-0915-4f24-b94a-0a07278c7229
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796149820187, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796149821971, "job": 1, "event": "recovery_finished"}
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: freelist init
Dec 03 21:09:09 compute-0 ceph-osd[86059]: freelist _read_cfg
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs umount
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) close
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bdev(0x561449a3b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluefs mount shared_bdev_used = 27262976
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Git sha 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: DB SUMMARY
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: DB Session ID:  STVQO16ELC5LNUOQD2NW
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                     Options.env: 0x561449e56a80
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                Options.info_log: 0x561449c86960
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                                 Options.wal_dir: db.wal
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.write_buffer_manager: 0x561448c9b900
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.row_cache: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                              Options.wal_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.wal_compression: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_background_jobs: 4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Compression algorithms supported:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kZSTD supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c86bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c398d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c870c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c39a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c870c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c39a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x561449c870c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x561448c39a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d2169ef4-0915-4f24-b94a-0a07278c7229
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796149884971, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796149904165, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 131, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796149, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2169ef4-0915-4f24-b94a-0a07278c7229", "db_session_id": "STVQO16ELC5LNUOQD2NW", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796149907370, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796149, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2169ef4-0915-4f24-b94a-0a07278c7229", "db_session_id": "STVQO16ELC5LNUOQD2NW", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796149910316, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796149, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2169ef4-0915-4f24-b94a-0a07278c7229", "db_session_id": "STVQO16ELC5LNUOQD2NW", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796149912101, "job": 1, "event": "recovery_finished"}
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x561449ea0000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: DB pointer 0x561449e40000
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Dec 03 21:09:09 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:09:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.27 KB,5.62933e-05%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:09:09 compute-0 ceph-osd[86059]: <cls> /ceph/rpmbuild/BUILD/ceph-20.2.0/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 03 21:09:09 compute-0 ceph-osd[86059]: <cls> /ceph/rpmbuild/BUILD/ceph-20.2.0/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 03 21:09:09 compute-0 ceph-osd[86059]: _get_class not permitted to load lua
Dec 03 21:09:09 compute-0 ceph-osd[86059]: _get_class not permitted to load sdk
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0 0 load_pgs
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0 0 load_pgs opened 0 pgs
Dec 03 21:09:09 compute-0 ceph-osd[86059]: osd.0 0 log_to_monitors true
Dec 03 21:09:09 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0[86055]: 2025-12-03T21:09:09.943+0000 7fb861a258c0 -1 osd.0 0 log_to_monitors true
Dec 03 21:09:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]} v 0)
Dec 03 21:09:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd={"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]} : dispatch
Dec 03 21:09:10 compute-0 podman[86592]: 2025-12-03 21:09:10.011910866 +0000 UTC m=+0.040750574 container create 7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:09:10 compute-0 systemd[1]: Started libpod-conmon-7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440.scope.
Dec 03 21:09:10 compute-0 podman[86592]: 2025-12-03 21:09:09.998172905 +0000 UTC m=+0.027012623 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:10 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:10 compute-0 podman[86592]: 2025-12-03 21:09:10.11388164 +0000 UTC m=+0.142721438 container init 7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_goldwasser, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:10 compute-0 podman[86592]: 2025-12-03 21:09:10.126717412 +0000 UTC m=+0.155557160 container start 7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_goldwasser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:10 compute-0 podman[86592]: 2025-12-03 21:09:10.131173213 +0000 UTC m=+0.160013021 container attach 7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_goldwasser, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:10 compute-0 charming_goldwasser[86609]: 167 167
Dec 03 21:09:10 compute-0 systemd[1]: libpod-7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440.scope: Deactivated successfully.
Dec 03 21:09:10 compute-0 conmon[86609]: conmon 7a74ab5b6fb41a56f4de <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440.scope/container/memory.events
Dec 03 21:09:10 compute-0 podman[86592]: 2025-12-03 21:09:10.136560814 +0000 UTC m=+0.165400562 container died 7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_goldwasser, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:09:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-bdb307fa6a0c82e1628436490bed9a9efcd4cb7b79781bdf7e9fceddedc5d8c0-merged.mount: Deactivated successfully.
Dec 03 21:09:10 compute-0 podman[86592]: 2025-12-03 21:09:10.190068757 +0000 UTC m=+0.218908515 container remove 7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_goldwasser, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:10 compute-0 systemd[1]: libpod-conmon-7a74ab5b6fb41a56f4def017af9b2a6a9792b5a1cce751d3db90d7137a9f5440.scope: Deactivated successfully.
Dec 03 21:09:10 compute-0 ceph-mon[75204]: pgmap v21: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 03 21:09:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:10 compute-0 ceph-mon[75204]: Deploying daemon osd.1 on compute-0
Dec 03 21:09:10 compute-0 ceph-mon[75204]: from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd={"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]} : dispatch
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e6 do_prune osdmap full prune enabled
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e6 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:09:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e7 e7: 3 total, 0 up, 3 in
Dec 03 21:09:10 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e7: 3 total, 0 up, 3 in
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0)
Dec 03 21:09:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd={"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]} : dispatch
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e7 create-or-move crush item name 'osd.0' initial_weight 0.02 at location {host=compute-0,root=default}
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:10 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:09:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:10 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:10 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:10 compute-0 podman[86639]: 2025-12-03 21:09:10.531553066 +0000 UTC m=+0.052676338 container create 82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 03 21:09:10 compute-0 systemd[1]: Started libpod-conmon-82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6.scope.
Dec 03 21:09:10 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0813c3f1db70882f987d58f5e84731b234c624483c74d6d6aa2deccae866249d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:10 compute-0 podman[86639]: 2025-12-03 21:09:10.506728708 +0000 UTC m=+0.027852030 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0813c3f1db70882f987d58f5e84731b234c624483c74d6d6aa2deccae866249d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0813c3f1db70882f987d58f5e84731b234c624483c74d6d6aa2deccae866249d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0813c3f1db70882f987d58f5e84731b234c624483c74d6d6aa2deccae866249d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0813c3f1db70882f987d58f5e84731b234c624483c74d6d6aa2deccae866249d/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:10 compute-0 podman[86639]: 2025-12-03 21:09:10.615197755 +0000 UTC m=+0.136321007 container init 82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:10 compute-0 podman[86639]: 2025-12-03 21:09:10.627943935 +0000 UTC m=+0.149067177 container start 82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:10 compute-0 podman[86639]: 2025-12-03 21:09:10.631486928 +0000 UTC m=+0.152610170 container attach 82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:10 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test[86655]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec 03 21:09:10 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test[86655]:                             [--no-systemd] [--no-tmpfs]
Dec 03 21:09:10 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test[86655]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 03 21:09:10 compute-0 systemd[1]: libpod-82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6.scope: Deactivated successfully.
Dec 03 21:09:10 compute-0 podman[86639]: 2025-12-03 21:09:10.823841608 +0000 UTC m=+0.344964850 container died 82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-0813c3f1db70882f987d58f5e84731b234c624483c74d6d6aa2deccae866249d-merged.mount: Deactivated successfully.
Dec 03 21:09:10 compute-0 podman[86639]: 2025-12-03 21:09:10.863177152 +0000 UTC m=+0.384300384 container remove 82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate-test, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:10 compute-0 systemd[1]: libpod-conmon-82b748883acf228d74b734608c59a679b018f0ba374661caffeadbd90320a9b6.scope: Deactivated successfully.
Dec 03 21:09:10 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 03 21:09:10 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 03 21:09:11 compute-0 systemd[1]: Reloading.
Dec 03 21:09:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v23: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:11 compute-0 systemd-rc-local-generator[86716]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:11 compute-0 systemd-sysv-generator[86722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e7 do_prune osdmap full prune enabled
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e7 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:09:11 compute-0 ceph-mon[75204]: from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Dec 03 21:09:11 compute-0 ceph-mon[75204]: osdmap e7: 3 total, 0 up, 3 in
Dec 03 21:09:11 compute-0 ceph-mon[75204]: from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd={"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]} : dispatch
Dec 03 21:09:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:11 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e8 e8: 3 total, 0 up, 3 in
Dec 03 21:09:11 compute-0 ceph-osd[86059]: osd.0 0 done with init, starting boot process
Dec 03 21:09:11 compute-0 ceph-osd[86059]: osd.0 0 start_boot
Dec 03 21:09:11 compute-0 ceph-osd[86059]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 03 21:09:11 compute-0 ceph-osd[86059]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 03 21:09:11 compute-0 ceph-osd[86059]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 03 21:09:11 compute-0 ceph-osd[86059]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 03 21:09:11 compute-0 ceph-osd[86059]: osd.0 0  bench count 12288000 bsize 4 KiB
Dec 03 21:09:11 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e8: 3 total, 0 up, 3 in
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:11 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:09:11 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:11 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:11 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/1181083466; not ready for session (expect reconnect)
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:11 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:09:11 compute-0 systemd[1]: Reloading.
Dec 03 21:09:11 compute-0 systemd-sysv-generator[86759]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:11 compute-0 systemd-rc-local-generator[86755]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e8 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:11 compute-0 systemd[1]: Starting Ceph osd.1 for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:09:11 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:09:12 compute-0 podman[86815]: 2025-12-03 21:09:12.102009119 +0000 UTC m=+0.073021993 container create 390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:09:12 compute-0 podman[86815]: 2025-12-03 21:09:12.059913829 +0000 UTC m=+0.030926763 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:12 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b20058cc80da8a68b7b079bee61797b1fd9f08e659127efaf366613702c7213/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b20058cc80da8a68b7b079bee61797b1fd9f08e659127efaf366613702c7213/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b20058cc80da8a68b7b079bee61797b1fd9f08e659127efaf366613702c7213/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b20058cc80da8a68b7b079bee61797b1fd9f08e659127efaf366613702c7213/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b20058cc80da8a68b7b079bee61797b1fd9f08e659127efaf366613702c7213/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:12 compute-0 podman[86815]: 2025-12-03 21:09:12.208425164 +0000 UTC m=+0.179438068 container init 390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:09:12 compute-0 podman[86815]: 2025-12-03 21:09:12.220172685 +0000 UTC m=+0.191185609 container start 390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 03 21:09:12 compute-0 podman[86815]: 2025-12-03 21:09:12.231872104 +0000 UTC m=+0.202885488 container attach 390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:12 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:12 compute-0 bash[86815]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:12 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:12 compute-0 bash[86815]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:12 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/1181083466; not ready for session (expect reconnect)
Dec 03 21:09:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:12 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:12 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:09:12 compute-0 ceph-mon[75204]: pgmap v23: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:12 compute-0 ceph-mon[75204]: from='osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 03 21:09:12 compute-0 ceph-mon[75204]: osdmap e8: 3 total, 0 up, 3 in
Dec 03 21:09:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:13 compute-0 lvm[86915]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:09:13 compute-0 lvm[86915]: VG ceph_vg0 finished
Dec 03 21:09:13 compute-0 lvm[86916]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:09:13 compute-0 lvm[86916]: VG ceph_vg1 finished
Dec 03 21:09:13 compute-0 lvm[86918]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:09:13 compute-0 lvm[86918]: VG ceph_vg2 finished
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:13 compute-0 bash[86815]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-1 --no-mon-config
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 03 21:09:13 compute-0 bash[86815]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1
Dec 03 21:09:13 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate[86830]: --> ceph-volume lvm activate successful for osd ID: 1
Dec 03 21:09:13 compute-0 bash[86815]: --> ceph-volume lvm activate successful for osd ID: 1
Dec 03 21:09:13 compute-0 systemd[1]: libpod-390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98.scope: Deactivated successfully.
Dec 03 21:09:13 compute-0 systemd[1]: libpod-390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98.scope: Consumed 1.686s CPU time.
Dec 03 21:09:13 compute-0 podman[86815]: 2025-12-03 21:09:13.408271445 +0000 UTC m=+1.379284359 container died 390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:13 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/1181083466; not ready for session (expect reconnect)
Dec 03 21:09:13 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:13 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:13 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:09:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-5b20058cc80da8a68b7b079bee61797b1fd9f08e659127efaf366613702c7213-merged.mount: Deactivated successfully.
Dec 03 21:09:13 compute-0 ceph-mon[75204]: purged_snaps scrub starts
Dec 03 21:09:13 compute-0 ceph-mon[75204]: purged_snaps scrub ok
Dec 03 21:09:13 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:13 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:13 compute-0 podman[86815]: 2025-12-03 21:09:13.517689981 +0000 UTC m=+1.488702875 container remove 390b5ae1b119b449f318cfa243368a97044a30c5680633299749cead7fe8fd98 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1-activate, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS)
Dec 03 21:09:13 compute-0 ceph-mgr[75500]: [devicehealth WARNING root] not enough osds to create mgr pool
Dec 03 21:09:13 compute-0 podman[87075]: 2025-12-03 21:09:13.806032614 +0000 UTC m=+0.062242163 container create 947e483d8391b48b3468ce765508e3878efe6194e4c13a9b9406fbc1894cb209 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:13 compute-0 podman[87075]: 2025-12-03 21:09:13.771027228 +0000 UTC m=+0.027236787 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24ff7bf7fd4b63083be2563f84919eef2a779ac57fa3c5dd2cd4a0d37e55d871/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24ff7bf7fd4b63083be2563f84919eef2a779ac57fa3c5dd2cd4a0d37e55d871/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24ff7bf7fd4b63083be2563f84919eef2a779ac57fa3c5dd2cd4a0d37e55d871/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24ff7bf7fd4b63083be2563f84919eef2a779ac57fa3c5dd2cd4a0d37e55d871/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24ff7bf7fd4b63083be2563f84919eef2a779ac57fa3c5dd2cd4a0d37e55d871/merged/var/lib/ceph/osd/ceph-1 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:14 compute-0 podman[87075]: 2025-12-03 21:09:14.03978691 +0000 UTC m=+0.295996439 container init 947e483d8391b48b3468ce765508e3878efe6194e4c13a9b9406fbc1894cb209 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:14 compute-0 podman[87075]: 2025-12-03 21:09:14.048652601 +0000 UTC m=+0.304862110 container start 947e483d8391b48b3468ce765508e3878efe6194e4c13a9b9406fbc1894cb209 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:09:14 compute-0 bash[87075]: 947e483d8391b48b3468ce765508e3878efe6194e4c13a9b9406fbc1894cb209
Dec 03 21:09:14 compute-0 systemd[1]: Started Ceph osd.1 for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:09:14 compute-0 ceph-osd[87094]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-osd, pid 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: pidfile_write: ignore empty --pid-file
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 sudo[86102]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 03 21:09:14 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 03 21:09:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:14 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:14 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Deploying daemon osd.2 on compute-0
Dec 03 21:09:14 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Deploying daemon osd.2 on compute-0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040400 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040400 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040400 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040400 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040400 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 sudo[87111]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:14 compute-0 sudo[87111]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c040000 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 sudo[87111]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:14 compute-0 ceph-osd[87094]: starting osd.1 osd_data /var/lib/ceph/osd/ceph-1 /var/lib/ceph/osd/ceph-1/journal
Dec 03 21:09:14 compute-0 ceph-osd[87094]: load: jerasure load: lrc 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 sudo[87145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:09:14 compute-0 sudo[87145]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-osd[87094]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 36.196 iops: 9266.164 elapsed_sec: 0.324
Dec 03 21:09:14 compute-0 ceph-osd[86059]: log_channel(cluster) log [WRN] : OSD bench result of 9266.164411 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 0 waiting for initial osdmap
Dec 03 21:09:14 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0[86055]: 2025-12-03T21:09:14.417+0000 7fb85d9a7640 -1 osd.0 0 waiting for initial osdmap
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 8 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 8 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 8 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 8 check_osdmap_features require_osd_release unknown -> tentacle
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1c041c00 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount shared_bdev_used = 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Git sha 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: DB SUMMARY
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: DB Session ID:  D9EAUIZ0QV3Y04LRFPJ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                     Options.env: 0x55cf1bed1ea0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                Options.info_log: 0x55cf1cf348a0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                 Options.wal_dir: db.wal
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.write_buffer_manager: 0x55cf1bf32b40
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.row_cache: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                              Options.wal_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.wal_compression: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_background_jobs: 4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Compression algorithms supported:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kZSTD supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 8 set_numa_affinity not setting numa affinity
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-0[86055]: 2025-12-03T21:09:14.441+0000 7fb8587ac640 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[86059]: osd.0 8 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed5a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed5a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed5a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 755311dd-465d-446c-bb3d-52d79ad19b23
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796154448042, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796154450352, "job": 1, "event": "recovery_finished"}
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old nid_max 1025
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta old blobid_max 10240
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _open_super_meta min_alloc_size 0x1000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: freelist init
Dec 03 21:09:14 compute-0 ceph-osd[87094]: freelist _read_cfg
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs umount
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) close
Dec 03 21:09:14 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.0 v2:192.168.122.100:6802/1181083466; not ready for session (expect reconnect)
Dec 03 21:09:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:14 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:14 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.0: (2) No such file or directory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) open path /var/lib/ceph/osd/ceph-1/block
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-1/block failed: (22) Invalid argument
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bdev(0x55cf1ccd7800 /var/lib/ceph/osd/ceph-1/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-1/block size 20 GiB
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluefs mount shared_bdev_used = 27262976
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Git sha 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: DB SUMMARY
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: DB Session ID:  D9EAUIZ0QV3Y04LRFPJ5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                     Options.env: 0x55cf1d104a80
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                Options.info_log: 0x55cf1cf34960
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                                 Options.wal_dir: db.wal
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.write_buffer_manager: 0x55cf1bf33900
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.row_cache: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                              Options.wal_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.wal_compression: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_background_jobs: 4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Compression algorithms supported:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kZSTD supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf34bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed58d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf350c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed5a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf350c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed5a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cf1cf350c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55cf1bed5a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 755311dd-465d-446c-bb3d-52d79ad19b23
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796154492309, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796154499591, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 131, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796154, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "755311dd-465d-446c-bb3d-52d79ad19b23", "db_session_id": "D9EAUIZ0QV3Y04LRFPJ5", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796154503045, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796154, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "755311dd-465d-446c-bb3d-52d79ad19b23", "db_session_id": "D9EAUIZ0QV3Y04LRFPJ5", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796154506144, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796154, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "755311dd-465d-446c-bb3d-52d79ad19b23", "db_session_id": "D9EAUIZ0QV3Y04LRFPJ5", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796154507663, "job": 1, "event": "recovery_finished"}
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 03 21:09:14 compute-0 ceph-mon[75204]: pgmap v25: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:14 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:14 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:14 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 03 21:09:14 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:14 compute-0 ceph-mon[75204]: Deploying daemon osd.2 on compute-0
Dec 03 21:09:14 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55cf1d13c000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: DB pointer 0x55cf1d0ee000
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super from 4, latest 4
Dec 03 21:09:14 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _upgrade_super done
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:09:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 1.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:09:14 compute-0 ceph-osd[87094]: <cls> /ceph/rpmbuild/BUILD/ceph-20.2.0/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 03 21:09:14 compute-0 ceph-osd[87094]: <cls> /ceph/rpmbuild/BUILD/ceph-20.2.0/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 03 21:09:14 compute-0 ceph-osd[87094]: _get_class not permitted to load lua
Dec 03 21:09:14 compute-0 ceph-osd[87094]: _get_class not permitted to load sdk
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1 0 load_pgs
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1 0 load_pgs opened 0 pgs
Dec 03 21:09:14 compute-0 ceph-osd[87094]: osd.1 0 log_to_monitors true
Dec 03 21:09:14 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1[87090]: 2025-12-03T21:09:14.533+0000 7f53eff388c0 -1 osd.1 0 log_to_monitors true
Dec 03 21:09:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]} v 0)
Dec 03 21:09:14 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd={"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]} : dispatch
Dec 03 21:09:14 compute-0 podman[87634]: 2025-12-03 21:09:14.708736381 +0000 UTC m=+0.035275942 container create b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_lederberg, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:09:14 compute-0 systemd[1]: Started libpod-conmon-b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d.scope.
Dec 03 21:09:14 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:14 compute-0 podman[87634]: 2025-12-03 21:09:14.692841216 +0000 UTC m=+0.019380797 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:14 compute-0 podman[87634]: 2025-12-03 21:09:14.79822942 +0000 UTC m=+0.124769071 container init b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_lederberg, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:14 compute-0 podman[87634]: 2025-12-03 21:09:14.81047807 +0000 UTC m=+0.137017671 container start b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_lederberg, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:14 compute-0 podman[87634]: 2025-12-03 21:09:14.815954702 +0000 UTC m=+0.142494353 container attach b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_lederberg, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 03 21:09:14 compute-0 funny_lederberg[87651]: 167 167
Dec 03 21:09:14 compute-0 systemd[1]: libpod-b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d.scope: Deactivated successfully.
Dec 03 21:09:14 compute-0 podman[87634]: 2025-12-03 21:09:14.819508255 +0000 UTC m=+0.146047826 container died b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_lederberg, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-42154d0bedce981c3e63acb395e76a58bba22a44b1870efb681fea14c7e84f8a-merged.mount: Deactivated successfully.
Dec 03 21:09:14 compute-0 podman[87634]: 2025-12-03 21:09:14.869761961 +0000 UTC m=+0.196301532 container remove b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_lederberg, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:14 compute-0 systemd[1]: libpod-conmon-b70b9acd0daa5f4217fa49e4e88ac2ba81aa7e49f0ed5bf4cae9195d1f025a5d.scope: Deactivated successfully.
Dec 03 21:09:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e8 do_prune osdmap full prune enabled
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e8 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e9 e9: 3 total, 1 up, 3 in
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466] boot
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e9: 3 total, 1 up, 3 in
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0)
Dec 03 21:09:15 compute-0 ceph-osd[86059]: osd.0 9 state: booting -> active
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd={"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]} : dispatch
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e9 create-or-move crush item name 'osd.1' initial_weight 0.02 at location {host=compute-0,root=default}
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:15 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:15 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:15 compute-0 podman[87679]: 2025-12-03 21:09:15.221228335 +0000 UTC m=+0.075946923 container create 182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:09:15 compute-0 systemd[1]: Started libpod-conmon-182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75.scope.
Dec 03 21:09:15 compute-0 podman[87679]: 2025-12-03 21:09:15.192766213 +0000 UTC m=+0.047484841 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:15 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe9eb4254c2426c9fa3c340e9102dbfed1471cfc4c0356005b5d0ec45829b9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe9eb4254c2426c9fa3c340e9102dbfed1471cfc4c0356005b5d0ec45829b9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe9eb4254c2426c9fa3c340e9102dbfed1471cfc4c0356005b5d0ec45829b9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe9eb4254c2426c9fa3c340e9102dbfed1471cfc4c0356005b5d0ec45829b9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fe9eb4254c2426c9fa3c340e9102dbfed1471cfc4c0356005b5d0ec45829b9/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:15 compute-0 podman[87679]: 2025-12-03 21:09:15.321207918 +0000 UTC m=+0.175926556 container init 182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:15 compute-0 podman[87679]: 2025-12-03 21:09:15.331911656 +0000 UTC m=+0.186630214 container start 182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:09:15 compute-0 podman[87679]: 2025-12-03 21:09:15.336033061 +0000 UTC m=+0.190751659 container attach 182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:09:15 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 03 21:09:15 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 03 21:09:15 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test[87695]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec 03 21:09:15 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test[87695]:                             [--no-systemd] [--no-tmpfs]
Dec 03 21:09:15 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test[87695]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 03 21:09:15 compute-0 systemd[1]: libpod-182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75.scope: Deactivated successfully.
Dec 03 21:09:15 compute-0 conmon[87695]: conmon 182336f97dc97be09dd7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75.scope/container/memory.events
Dec 03 21:09:15 compute-0 podman[87679]: 2025-12-03 21:09:15.528595906 +0000 UTC m=+0.383314464 container died 182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 03 21:09:15 compute-0 ceph-mon[75204]: OSD bench result of 9266.164411 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 03 21:09:15 compute-0 ceph-mon[75204]: from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd={"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]} : dispatch
Dec 03 21:09:15 compute-0 ceph-mon[75204]: pgmap v26: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
Dec 03 21:09:15 compute-0 ceph-mon[75204]: from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Dec 03 21:09:15 compute-0 ceph-mon[75204]: osd.0 [v2:192.168.122.100:6802/1181083466,v1:192.168.122.100:6803/1181083466] boot
Dec 03 21:09:15 compute-0 ceph-mon[75204]: osdmap e9: 3 total, 1 up, 3 in
Dec 03 21:09:15 compute-0 ceph-mon[75204]: from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd={"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]} : dispatch
Dec 03 21:09:15 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Dec 03 21:09:15 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:15 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-56fe9eb4254c2426c9fa3c340e9102dbfed1471cfc4c0356005b5d0ec45829b9-merged.mount: Deactivated successfully.
Dec 03 21:09:15 compute-0 podman[87679]: 2025-12-03 21:09:15.580980966 +0000 UTC m=+0.435699514 container remove 182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:09:15 compute-0 systemd[1]: libpod-conmon-182336f97dc97be09dd763af6aaa47ea45eada61fce719fec02581e14c31ec75.scope: Deactivated successfully.
Dec 03 21:09:15 compute-0 ceph-mgr[75500]: [devicehealth INFO root] creating mgr pool
Dec 03 21:09:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true} v 0)
Dec 03 21:09:15 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true} : dispatch
Dec 03 21:09:15 compute-0 systemd[1]: Reloading.
Dec 03 21:09:16 compute-0 systemd-sysv-generator[87757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:16 compute-0 systemd-rc-local-generator[87753]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e9 do_prune osdmap full prune enabled
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e9 encode_pending skipping prime_pg_temp; mapping job did not start
Dec 03 21:09:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 03 21:09:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e10 e10: 3 total, 1 up, 3 in
Dec 03 21:09:16 compute-0 ceph-osd[87094]: osd.1 0 done with init, starting boot process
Dec 03 21:09:16 compute-0 ceph-osd[87094]: osd.1 0 start_boot
Dec 03 21:09:16 compute-0 ceph-osd[87094]: osd.1 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 03 21:09:16 compute-0 ceph-osd[87094]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 03 21:09:16 compute-0 ceph-osd[87094]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 03 21:09:16 compute-0 ceph-osd[87094]: osd.1 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 03 21:09:16 compute-0 ceph-osd[87094]: osd.1 0  bench count 12288000 bsize 4 KiB
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e10 crush map has features 3314933000852226048, adjusting msgr requires
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e10 crush map has features 288514051259236352, adjusting msgr requires
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e10 crush map has features 288514051259236352, adjusting msgr requires
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e10 crush map has features 288514051259236352, adjusting msgr requires
Dec 03 21:09:16 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e10: 3 total, 1 up, 3 in
Dec 03 21:09:16 compute-0 ceph-osd[86059]: osd.0 10 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 03 21:09:16 compute-0 ceph-osd[86059]: osd.0 10 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Dec 03 21:09:16 compute-0 ceph-osd[86059]: osd.0 10 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:16 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:16 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true} v 0)
Dec 03 21:09:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true} : dispatch
Dec 03 21:09:16 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:16 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:16 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/96508272; not ready for session (expect reconnect)
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:16 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:16 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:16 compute-0 systemd[1]: Reloading.
Dec 03 21:09:16 compute-0 systemd-rc-local-generator[87798]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:16 compute-0 systemd-sysv-generator[87802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:16 compute-0 systemd[1]: Starting Ceph osd.2 for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:09:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true} : dispatch
Dec 03 21:09:16 compute-0 ceph-mon[75204]: from='osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 03 21:09:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Dec 03 21:09:16 compute-0 ceph-mon[75204]: osdmap e10: 3 total, 1 up, 3 in
Dec 03 21:09:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true} : dispatch
Dec 03 21:09:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e10 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:16 compute-0 podman[87854]: 2025-12-03 21:09:16.824904697 +0000 UTC m=+0.066711033 container create fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:16 compute-0 podman[87854]: 2025-12-03 21:09:16.791142417 +0000 UTC m=+0.032948803 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:16 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/841fa57832501ab18683aa900518b0bea3e0c517c3ba104b10f9784ef74c8840/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/841fa57832501ab18683aa900518b0bea3e0c517c3ba104b10f9784ef74c8840/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/841fa57832501ab18683aa900518b0bea3e0c517c3ba104b10f9784ef74c8840/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/841fa57832501ab18683aa900518b0bea3e0c517c3ba104b10f9784ef74c8840/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/841fa57832501ab18683aa900518b0bea3e0c517c3ba104b10f9784ef74c8840/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:16 compute-0 podman[87854]: 2025-12-03 21:09:16.944510002 +0000 UTC m=+0.186316328 container init fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:09:16 compute-0 podman[87854]: 2025-12-03 21:09:16.952679199 +0000 UTC m=+0.194485495 container start fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030)
Dec 03 21:09:16 compute-0 podman[87854]: 2025-12-03 21:09:16.966394209 +0000 UTC m=+0.208200515 container attach fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v29: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 03 21:09:17 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 bash[87854]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e10 do_prune osdmap full prune enabled
Dec 03 21:09:17 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/96508272; not ready for session (expect reconnect)
Dec 03 21:09:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:17 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Dec 03 21:09:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e11 e11: 3 total, 1 up, 3 in
Dec 03 21:09:17 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e11: 3 total, 1 up, 3 in
Dec 03 21:09:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:17 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:17 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:17 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 bash[87854]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 ceph-mon[75204]: purged_snaps scrub starts
Dec 03 21:09:17 compute-0 ceph-mon[75204]: purged_snaps scrub ok
Dec 03 21:09:17 compute-0 ceph-mon[75204]: pgmap v29: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 03 21:09:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Dec 03 21:09:17 compute-0 ceph-mon[75204]: osdmap e11: 3 total, 1 up, 3 in
Dec 03 21:09:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:17 compute-0 lvm[87955]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:09:17 compute-0 lvm[87955]: VG ceph_vg1 finished
Dec 03 21:09:17 compute-0 lvm[87954]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:09:17 compute-0 lvm[87954]: VG ceph_vg0 finished
Dec 03 21:09:17 compute-0 lvm[87957]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:09:17 compute-0 lvm[87957]: VG ceph_vg2 finished
Dec 03 21:09:17 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 03 21:09:17 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 bash[87854]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 03 21:09:17 compute-0 bash[87854]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 bash[87854]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 03 21:09:17 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 03 21:09:17 compute-0 bash[87854]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 03 21:09:17 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg2/ceph_lv2 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 03 21:09:17 compute-0 bash[87854]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg2/ceph_lv2 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Dec 03 21:09:18 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/ln -snf /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 bash[87854]: Running command: /usr/bin/ln -snf /dev/ceph_vg2/ceph_lv2 /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 bash[87854]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 03 21:09:18 compute-0 bash[87854]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2
Dec 03 21:09:18 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 03 21:09:18 compute-0 bash[87854]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Dec 03 21:09:18 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate[87869]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 03 21:09:18 compute-0 bash[87854]: --> ceph-volume lvm activate successful for osd ID: 2
Dec 03 21:09:18 compute-0 systemd[1]: libpod-fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434.scope: Deactivated successfully.
Dec 03 21:09:18 compute-0 systemd[1]: libpod-fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434.scope: Consumed 1.633s CPU time.
Dec 03 21:09:18 compute-0 podman[88054]: 2025-12-03 21:09:18.162055904 +0000 UTC m=+0.033344522 container died fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030)
Dec 03 21:09:18 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/96508272; not ready for session (expect reconnect)
Dec 03 21:09:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:18 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:18 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-841fa57832501ab18683aa900518b0bea3e0c517c3ba104b10f9784ef74c8840-merged.mount: Deactivated successfully.
Dec 03 21:09:18 compute-0 podman[88054]: 2025-12-03 21:09:18.275201126 +0000 UTC m=+0.146489694 container remove fae8765a8ea84675703de8a96589308baab2490b03e534d120a72d0aeb771434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2-activate, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:09:18 compute-0 podman[88110]: 2025-12-03 21:09:18.521088711 +0000 UTC m=+0.059622929 container create f54ced40cf6e3cfd17117967711c6e8c3d0af1904d4dd52bc53b3908241174db (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:18 compute-0 podman[88110]: 2025-12-03 21:09:18.488358542 +0000 UTC m=+0.026892850 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c66876297b08a692412578d395f689e9d9d959b66002228d001a170918bf222a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c66876297b08a692412578d395f689e9d9d959b66002228d001a170918bf222a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c66876297b08a692412578d395f689e9d9d959b66002228d001a170918bf222a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c66876297b08a692412578d395f689e9d9d959b66002228d001a170918bf222a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c66876297b08a692412578d395f689e9d9d959b66002228d001a170918bf222a/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:18 compute-0 podman[88110]: 2025-12-03 21:09:18.641003312 +0000 UTC m=+0.179537530 container init f54ced40cf6e3cfd17117967711c6e8c3d0af1904d4dd52bc53b3908241174db (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:18 compute-0 podman[88110]: 2025-12-03 21:09:18.648207589 +0000 UTC m=+0.186741797 container start f54ced40cf6e3cfd17117967711c6e8c3d0af1904d4dd52bc53b3908241174db (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:09:18 compute-0 bash[88110]: f54ced40cf6e3cfd17117967711c6e8c3d0af1904d4dd52bc53b3908241174db
Dec 03 21:09:18 compute-0 systemd[1]: Started Ceph osd.2 for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:09:18 compute-0 ceph-osd[88129]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:09:18 compute-0 ceph-osd[88129]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-osd, pid 2
Dec 03 21:09:18 compute-0 ceph-osd[88129]: pidfile_write: ignore empty --pid-file
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 sudo[87145]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624400 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624400 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07624000 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-osd[88129]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Dec 03 21:09:18 compute-0 ceph-osd[88129]: load: jerasure load: lrc 
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 sudo[88149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:18 compute-0 sudo[88149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:18 compute-0 sudo[88149]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-osd[88129]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 03 21:09:18 compute-0 ceph-osd[88129]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:18 compute-0 sudo[88183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:09:18 compute-0 sudo[88183]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:18 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f07625c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount shared_bdev_used = 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Git sha 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: DB SUMMARY
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: DB Session ID:  95T2HRKJRJBHLQ1U4F3O
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                     Options.env: 0x559f074b5ea0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                Options.info_log: 0x559f085068a0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                 Options.wal_dir: db.wal
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.write_buffer_manager: 0x559f0751ab40
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.row_cache: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                              Options.wal_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.wal_compression: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_background_jobs: 4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Compression algorithms supported:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kZSTD supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b9a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b9a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506c80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b9a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5d37ca13-9b22-4d6f-b7f5-d136582f32d0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796159033233, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796159034800, "job": 1, "event": "recovery_finished"}
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: freelist init
Dec 03 21:09:19 compute-0 ceph-osd[88129]: freelist _read_cfg
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs umount
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) close
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bdev(0x559f082bb800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluefs mount shared_bdev_used = 27262976
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: RocksDB version: 7.9.2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Git sha 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Compile date 2025-10-30 15:42:43
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: DB SUMMARY
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: DB Session ID:  95T2HRKJRJBHLQ1U4F3P
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: CURRENT file:  CURRENT
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: IDENTITY file:  IDENTITY
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.error_if_exists: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.create_if_missing: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.paranoid_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                     Options.env: 0x559f086d6a80
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                Options.info_log: 0x559f0853b7c0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_file_opening_threads: 16
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                              Options.statistics: (nil)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.use_fsync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.max_log_file_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.allow_fallocate: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.use_direct_reads: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.create_missing_column_families: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                              Options.db_log_dir: 
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                                 Options.wal_dir: db.wal
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.advise_random_on_open: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.write_buffer_manager: 0x559f0751b900
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                            Options.rate_limiter: (nil)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.unordered_write: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.row_cache: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                              Options.wal_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.allow_ingest_behind: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.two_write_queues: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.manual_wal_flush: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.wal_compression: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.atomic_flush: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.log_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.allow_data_in_errors: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.db_host_id: __hostname__
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_background_jobs: 4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_background_compactions: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_subcompactions: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.max_open_files: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.bytes_per_sync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.max_background_flushes: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Compression algorithms supported:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kZSTD supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kXpressCompression supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kBZip2Compression supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kLZ4Compression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kZlibCompression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kLZ4HCCompression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         kSnappyCompression supported: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f08506bc0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b98d0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f085070c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b9a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f085070c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b9a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:           Options.merge_operator: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.compaction_filter_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.sst_partitioner_factory: None
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559f085070c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x559f074b9a30
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.write_buffer_size: 16777216
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.max_write_buffer_number: 64
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.compression: LZ4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.num_levels: 7
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.level: 32767
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.compression_opts.strategy: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                  Options.compression_opts.enabled: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.arena_block_size: 1048576
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.disable_auto_compactions: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.inplace_update_support: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.bloom_locality: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                    Options.max_successive_merges: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.paranoid_file_checks: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.force_consistency_checks: 1
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.report_bg_io_stats: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                               Options.ttl: 2592000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                       Options.enable_blob_files: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                           Options.min_blob_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                          Options.blob_file_size: 268435456
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb:                Options.blob_file_starting_level: 0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 5d37ca13-9b22-4d6f-b7f5-d136582f32d0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796159096395, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796159101160, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 131, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796159, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5d37ca13-9b22-4d6f-b7f5-d136582f32d0", "db_session_id": "95T2HRKJRJBHLQ1U4F3P", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796159111443, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796159, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5d37ca13-9b22-4d6f-b7f5-d136582f32d0", "db_session_id": "95T2HRKJRJBHLQ1U4F3P", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796159113999, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796159, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "5d37ca13-9b22-4d6f-b7f5-d136582f32d0", "db_session_id": "95T2HRKJRJBHLQ1U4F3P", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796159122382, "job": 1, "event": "recovery_finished"}
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 03 21:09:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v31: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x559f08720000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: DB pointer 0x559f086c0000
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Dec 03 21:09:19 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:09:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:09:19 compute-0 ceph-osd[88129]: <cls> /ceph/rpmbuild/BUILD/ceph-20.2.0/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 03 21:09:19 compute-0 ceph-osd[88129]: <cls> /ceph/rpmbuild/BUILD/ceph-20.2.0/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 03 21:09:19 compute-0 ceph-osd[88129]: _get_class not permitted to load lua
Dec 03 21:09:19 compute-0 ceph-osd[88129]: _get_class not permitted to load sdk
Dec 03 21:09:19 compute-0 ceph-osd[88129]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 03 21:09:19 compute-0 ceph-osd[88129]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 03 21:09:19 compute-0 ceph-osd[88129]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Dec 03 21:09:19 compute-0 ceph-osd[88129]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 03 21:09:19 compute-0 ceph-osd[88129]: osd.2 0 load_pgs
Dec 03 21:09:19 compute-0 ceph-osd[88129]: osd.2 0 load_pgs opened 0 pgs
Dec 03 21:09:19 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2[88125]: 2025-12-03T21:09:19.177+0000 7f53bfdc88c0 -1 osd.2 0 log_to_monitors true
Dec 03 21:09:19 compute-0 ceph-osd[88129]: osd.2 0 log_to_monitors true
Dec 03 21:09:19 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.1 v2:192.168.122.100:6806/96508272; not ready for session (expect reconnect)
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:19 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.1: (2) No such file or directory
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd={"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} : dispatch
Dec 03 21:09:19 compute-0 podman[88643]: 2025-12-03 21:09:19.287631166 +0000 UTC m=+0.048196026 container create 89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bhabha, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:19 compute-0 systemd[1]: Started libpod-conmon-89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3.scope.
Dec 03 21:09:19 compute-0 podman[88643]: 2025-12-03 21:09:19.263648286 +0000 UTC m=+0.024213146 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:19 compute-0 podman[88643]: 2025-12-03 21:09:19.399344459 +0000 UTC m=+0.159909309 container init 89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bhabha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 28.185 iops: 7215.454 elapsed_sec: 0.416
Dec 03 21:09:19 compute-0 ceph-osd[87094]: log_channel(cluster) log [WRN] : OSD bench result of 7215.453955 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 0 waiting for initial osdmap
Dec 03 21:09:19 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1[87090]: 2025-12-03T21:09:19.399+0000 7f53ebeba640 -1 osd.1 0 waiting for initial osdmap
Dec 03 21:09:19 compute-0 podman[88643]: 2025-12-03 21:09:19.407590037 +0000 UTC m=+0.168154877 container start 89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bhabha, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:09:19 compute-0 systemd[1]: libpod-89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3.scope: Deactivated successfully.
Dec 03 21:09:19 compute-0 dazzling_bhabha[88659]: 167 167
Dec 03 21:09:19 compute-0 conmon[88659]: conmon 89d985d660b3b30b51c3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3.scope/container/memory.events
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 11 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 11 check_osdmap_features require_osd_release unknown -> tentacle
Dec 03 21:09:19 compute-0 podman[88643]: 2025-12-03 21:09:19.416064041 +0000 UTC m=+0.176628881 container attach 89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bhabha, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:19 compute-0 podman[88643]: 2025-12-03 21:09:19.416423578 +0000 UTC m=+0.176988418 container died 89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bhabha, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 11 set_numa_affinity not setting numa affinity
Dec 03 21:09:19 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-1[87090]: 2025-12-03T21:09:19.428+0000 7f53e6cbf640 -1 osd.1 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 11 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial no unique device path for loop4: no symlink to loop4 in /dev/disk/by-path
Dec 03 21:09:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-076f3776f9582382c44a1592f150237b9f1089ff3d4ef162232615b62d092623-merged.mount: Deactivated successfully.
Dec 03 21:09:19 compute-0 podman[88643]: 2025-12-03 21:09:19.471778639 +0000 UTC m=+0.232343479 container remove 89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bhabha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:09:19 compute-0 systemd[1]: libpod-conmon-89d985d660b3b30b51c3fea31304ab2ebe184ff5ac88445d0b1d3ca68855b0b3.scope: Deactivated successfully.
Dec 03 21:09:19 compute-0 podman[88683]: 2025-12-03 21:09:19.624403529 +0000 UTC m=+0.047582314 container create 64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stoic_jones, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:19 compute-0 systemd[1]: Started libpod-conmon-64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e.scope.
Dec 03 21:09:19 compute-0 podman[88683]: 2025-12-03 21:09:19.603911589 +0000 UTC m=+0.027090394 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89d6cc1050b3e1e425814b298a9067592341e4c8e1345bd8a1e3cc4dd42d882b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89d6cc1050b3e1e425814b298a9067592341e4c8e1345bd8a1e3cc4dd42d882b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89d6cc1050b3e1e425814b298a9067592341e4c8e1345bd8a1e3cc4dd42d882b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89d6cc1050b3e1e425814b298a9067592341e4c8e1345bd8a1e3cc4dd42d882b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:19 compute-0 podman[88683]: 2025-12-03 21:09:19.729054957 +0000 UTC m=+0.152233792 container init 64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stoic_jones, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:19 compute-0 podman[88683]: 2025-12-03 21:09:19.74482061 +0000 UTC m=+0.167999435 container start 64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stoic_jones, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:09:19 compute-0 podman[88683]: 2025-12-03 21:09:19.749325791 +0000 UTC m=+0.172504626 container attach 64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stoic_jones, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:19 compute-0 ceph-mon[75204]: pgmap v31: 1 pgs: 1 unknown; 0 B data, 426 MiB used, 20 GiB / 20 GiB avail
Dec 03 21:09:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:19 compute-0 ceph-mon[75204]: from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd={"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} : dispatch
Dec 03 21:09:19 compute-0 ceph-mon[75204]: OSD bench result of 7215.453955 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e11 do_prune osdmap full prune enabled
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e12 e12: 3 total, 2 up, 3 in
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272] boot
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e12: 3 total, 2 up, 3 in
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]} v 0)
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd={"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]} : dispatch
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e12 create-or-move crush item name 'osd.2' initial_weight 0.02 at location {host=compute-0,root=default}
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:19 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:19 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 12 state: booting -> active
Dec 03 21:09:19 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 12 pg[1.0( empty local-lis/les=0/0 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=12) [1] r=0 lpr=12 pi=[10,12)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:09:20 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 03 21:09:20 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 03 21:09:20 compute-0 lvm[88774]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:09:20 compute-0 lvm[88774]: VG ceph_vg0 finished
Dec 03 21:09:20 compute-0 lvm[88776]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:09:20 compute-0 lvm[88776]: VG ceph_vg1 finished
Dec 03 21:09:20 compute-0 lvm[88777]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:09:20 compute-0 lvm[88777]: VG ceph_vg2 finished
Dec 03 21:09:20 compute-0 stoic_jones[88699]: {}
Dec 03 21:09:20 compute-0 systemd[1]: libpod-64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e.scope: Deactivated successfully.
Dec 03 21:09:20 compute-0 systemd[1]: libpod-64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e.scope: Consumed 1.407s CPU time.
Dec 03 21:09:20 compute-0 podman[88683]: 2025-12-03 21:09:20.649179732 +0000 UTC m=+1.072358527 container died 64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stoic_jones, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-89d6cc1050b3e1e425814b298a9067592341e4c8e1345bd8a1e3cc4dd42d882b-merged.mount: Deactivated successfully.
Dec 03 21:09:20 compute-0 podman[88683]: 2025-12-03 21:09:20.704267127 +0000 UTC m=+1.127445922 container remove 64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stoic_jones, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030)
Dec 03 21:09:20 compute-0 systemd[1]: libpod-conmon-64484685d43764b16eacfee894dc8b6359e3f105d33136f2b53233df5aaddd0e.scope: Deactivated successfully.
Dec 03 21:09:20 compute-0 sudo[88183]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:20 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:20 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e12 do_prune osdmap full prune enabled
Dec 03 21:09:20 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 03 21:09:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e13 e13: 3 total, 2 up, 3 in
Dec 03 21:09:20 compute-0 ceph-osd[88129]: osd.2 0 done with init, starting boot process
Dec 03 21:09:20 compute-0 ceph-osd[88129]: osd.2 0 start_boot
Dec 03 21:09:20 compute-0 ceph-osd[88129]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 03 21:09:20 compute-0 ceph-osd[88129]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 03 21:09:20 compute-0 ceph-osd[88129]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 03 21:09:20 compute-0 ceph-osd[88129]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 03 21:09:20 compute-0 ceph-osd[88129]: osd.2 0  bench count 12288000 bsize 4 KiB
Dec 03 21:09:20 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e13: 3 total, 2 up, 3 in
Dec 03 21:09:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:20 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:20 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 13 pg[1.0( empty local-lis/les=12/13 n=0 ec=10/10 lis/c=0/0 les/c/f=0/0/0 sis=12) [1] r=0 lpr=12 pi=[10,12)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:09:20 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:20 compute-0 ceph-mon[75204]: from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec 03 21:09:20 compute-0 ceph-mon[75204]: osd.1 [v2:192.168.122.100:6806/96508272,v1:192.168.122.100:6807/96508272] boot
Dec 03 21:09:20 compute-0 ceph-mon[75204]: osdmap e12: 3 total, 2 up, 3 in
Dec 03 21:09:20 compute-0 ceph-mon[75204]: from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd={"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]} : dispatch
Dec 03 21:09:20 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Dec 03 21:09:20 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:20 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:20 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:20 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/563643625; not ready for session (expect reconnect)
Dec 03 21:09:20 compute-0 sudo[88793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:09:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:20 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:20 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:20 compute-0 sudo[88793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:20 compute-0 sudo[88793]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:20 compute-0 ceph-mgr[75500]: [devicehealth INFO root] creating main.db for devicehealth
Dec 03 21:09:20 compute-0 sudo[88819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:20 compute-0 sudo[88819]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:20 compute-0 sudo[88819]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:21 compute-0 sudo[88844]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:09:21 compute-0 sudo[88844]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [devicehealth INFO root] Check health
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [devicehealth ERROR root] Fail to parse JSON result from daemon osd.2 ()
Dec 03 21:09:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Dec 03 21:09:21 compute-0 sudo[88880]:     ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda
Dec 03 21:09:21 compute-0 sudo[88880]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 03 21:09:21 compute-0 sudo[88880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=167)
Dec 03 21:09:21 compute-0 sudo[88880]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v34: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 03 21:09:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Dec 03 21:09:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon metadata", "id": "compute-0"} v 0)
Dec 03 21:09:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata", "id": "compute-0"} : dispatch
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:09:21
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Some PGs (1.000000) are inactive; try again later
Dec 03 21:09:21 compute-0 podman[88925]: 2025-12-03 21:09:21.509292719 +0000 UTC m=+0.083584949 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:09:21 compute-0 podman[88925]: 2025-12-03 21:09:21.60089898 +0000 UTC m=+0.175191240 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:09:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e13 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 42941284352
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 1 (current 1)
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:09:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e13 do_prune osdmap full prune enabled
Dec 03 21:09:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e14 e14: 3 total, 2 up, 3 in
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/563643625; not ready for session (expect reconnect)
Dec 03 21:09:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e14: 3 total, 2 up, 3 in
Dec 03 21:09:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:21 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mgrmap e9: compute-0.jxauqt(active, since 60s)
Dec 03 21:09:21 compute-0 ceph-mon[75204]: from='osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Dec 03 21:09:21 compute-0 ceph-mon[75204]: osdmap e13: 3 total, 2 up, 3 in
Dec 03 21:09:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:21 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Dec 03 21:09:21 compute-0 ceph-mon[75204]: pgmap v34: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 03 21:09:21 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Dec 03 21:09:21 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mon metadata", "id": "compute-0"} : dispatch
Dec 03 21:09:22 compute-0 sudo[88844]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:22 compute-0 sudo[89075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:22 compute-0 sudo[89075]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:22 compute-0 sudo[89075]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:22 compute-0 sudo[89100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- inventory --format=json-pretty --filter-for-batch
Dec 03 21:09:22 compute-0 sudo[89100]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:22 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/563643625; not ready for session (expect reconnect)
Dec 03 21:09:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:22 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:22 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:22 compute-0 podman[89137]: 2025-12-03 21:09:22.859750267 +0000 UTC m=+0.058983156 container create eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_kepler, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:22 compute-0 systemd[1]: Started libpod-conmon-eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70.scope.
Dec 03 21:09:22 compute-0 podman[89137]: 2025-12-03 21:09:22.829922847 +0000 UTC m=+0.029155846 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:22 compute-0 ceph-mon[75204]: purged_snaps scrub starts
Dec 03 21:09:22 compute-0 ceph-mon[75204]: purged_snaps scrub ok
Dec 03 21:09:22 compute-0 ceph-mon[75204]: osdmap e14: 3 total, 2 up, 3 in
Dec 03 21:09:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:22 compute-0 ceph-mon[75204]: mgrmap e9: compute-0.jxauqt(active, since 60s)
Dec 03 21:09:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:22 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:22 compute-0 podman[89137]: 2025-12-03 21:09:22.962384724 +0000 UTC m=+0.161617633 container init eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_kepler, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:09:22 compute-0 podman[89137]: 2025-12-03 21:09:22.972373109 +0000 UTC m=+0.171605998 container start eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_kepler, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:22 compute-0 objective_kepler[89153]: 167 167
Dec 03 21:09:22 compute-0 systemd[1]: libpod-eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70.scope: Deactivated successfully.
Dec 03 21:09:22 compute-0 conmon[89153]: conmon eadc40c29fbd4bab5aae <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70.scope/container/memory.events
Dec 03 21:09:22 compute-0 podman[89137]: 2025-12-03 21:09:22.993259335 +0000 UTC m=+0.192492244 container attach eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_kepler, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:22 compute-0 podman[89137]: 2025-12-03 21:09:22.993611312 +0000 UTC m=+0.192844191 container died eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_kepler, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-46c2880fd10ce639b18faef50696152f9bccbc756b9c4eb10b32f5541eccade6-merged.mount: Deactivated successfully.
Dec 03 21:09:23 compute-0 podman[89137]: 2025-12-03 21:09:23.125633731 +0000 UTC m=+0.324866620 container remove eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_kepler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:23 compute-0 systemd[1]: libpod-conmon-eadc40c29fbd4bab5aaefb960ef56704f8904ab0760d1bdf019d43755fa9dc70.scope: Deactivated successfully.
Dec 03 21:09:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v36: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 03 21:09:23 compute-0 sudo[89209]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldeobjpvrbzbnrbudgjlkfjrxrfbzcvm ; /usr/bin/python3'
Dec 03 21:09:23 compute-0 sudo[89209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:23 compute-0 podman[89188]: 2025-12-03 21:09:23.330896855 +0000 UTC m=+0.079521166 container create 4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=bold_einstein, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:09:23 compute-0 systemd[1]: Started libpod-conmon-4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e.scope.
Dec 03 21:09:23 compute-0 podman[89188]: 2025-12-03 21:09:23.281086697 +0000 UTC m=+0.029711038 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:23 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/720e15975c0d0b7ce12c9bc415c5afbf6282ccce5fbd87d4f61b425435e1c855/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/720e15975c0d0b7ce12c9bc415c5afbf6282ccce5fbd87d4f61b425435e1c855/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/720e15975c0d0b7ce12c9bc415c5afbf6282ccce5fbd87d4f61b425435e1c855/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/720e15975c0d0b7ce12c9bc415c5afbf6282ccce5fbd87d4f61b425435e1c855/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:23 compute-0 python3[89216]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /home/ceph-admin/specs/ceph_spec.yaml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .osdmap.num_up_osds _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:23 compute-0 podman[89188]: 2025-12-03 21:09:23.436050614 +0000 UTC m=+0.184675005 container init 4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=bold_einstein, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:09:23 compute-0 podman[89188]: 2025-12-03 21:09:23.447418376 +0000 UTC m=+0.196042687 container start 4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=bold_einstein, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030)
Dec 03 21:09:23 compute-0 podman[89188]: 2025-12-03 21:09:23.461102046 +0000 UTC m=+0.209726377 container attach 4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=bold_einstein, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:23 compute-0 podman[89226]: 2025-12-03 21:09:23.515371105 +0000 UTC m=+0.077069756 container create 19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6 (image=quay.io/ceph/ceph:v20, name=magical_taussig, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:23 compute-0 systemd[1]: Started libpod-conmon-19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6.scope.
Dec 03 21:09:23 compute-0 podman[89226]: 2025-12-03 21:09:23.470212593 +0000 UTC m=+0.031911324 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:23 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/873fe2228dc62cfe9aed1332bdfed7a339a959c1d31123a871a0ee7e7081789a/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/873fe2228dc62cfe9aed1332bdfed7a339a959c1d31123a871a0ee7e7081789a/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/873fe2228dc62cfe9aed1332bdfed7a339a959c1d31123a871a0ee7e7081789a/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:23 compute-0 podman[89226]: 2025-12-03 21:09:23.615533552 +0000 UTC m=+0.177232313 container init 19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6 (image=quay.io/ceph/ceph:v20, name=magical_taussig, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:23 compute-0 podman[89226]: 2025-12-03 21:09:23.62278348 +0000 UTC m=+0.184482131 container start 19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6 (image=quay.io/ceph/ceph:v20, name=magical_taussig, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:23 compute-0 podman[89226]: 2025-12-03 21:09:23.646773601 +0000 UTC m=+0.208472272 container attach 19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6 (image=quay.io/ceph/ceph:v20, name=magical_taussig, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:23 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/563643625; not ready for session (expect reconnect)
Dec 03 21:09:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:23 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:23 compute-0 ceph-mon[75204]: pgmap v36: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 03 21:09:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:24 compute-0 bold_einstein[89222]: [
Dec 03 21:09:24 compute-0 bold_einstein[89222]:     {
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "available": false,
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "being_replaced": false,
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "ceph_device_lvm": false,
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "lsm_data": {},
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "lvs": [],
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "path": "/dev/sr0",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "rejected_reasons": [
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "Insufficient space (<5GB)",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "Has a FileSystem"
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         ],
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         "sys_api": {
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "actuators": null,
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "device_nodes": [
Dec 03 21:09:24 compute-0 bold_einstein[89222]:                 "sr0"
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             ],
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "devname": "sr0",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "human_readable_size": "482.00 KB",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "id_bus": "ata",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "model": "QEMU DVD-ROM",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "nr_requests": "2",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "parent": "/dev/sr0",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "partitions": {},
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "path": "/dev/sr0",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "removable": "1",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "rev": "2.5+",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "ro": "0",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "rotational": "1",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "sas_address": "",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "sas_device_handle": "",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "scheduler_mode": "mq-deadline",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "sectors": 0,
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "sectorsize": "2048",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "size": 493568.0,
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "support_discard": "2048",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "type": "disk",
Dec 03 21:09:24 compute-0 bold_einstein[89222]:             "vendor": "QEMU"
Dec 03 21:09:24 compute-0 bold_einstein[89222]:         }
Dec 03 21:09:24 compute-0 bold_einstein[89222]:     }
Dec 03 21:09:24 compute-0 bold_einstein[89222]: ]
Dec 03 21:09:24 compute-0 systemd[1]: libpod-4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e.scope: Deactivated successfully.
Dec 03 21:09:24 compute-0 podman[89188]: 2025-12-03 21:09:24.083324671 +0000 UTC m=+0.831948972 container died 4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=bold_einstein, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:09:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-720e15975c0d0b7ce12c9bc415c5afbf6282ccce5fbd87d4f61b425435e1c855-merged.mount: Deactivated successfully.
Dec 03 21:09:24 compute-0 podman[89188]: 2025-12-03 21:09:24.156016297 +0000 UTC m=+0.904640598 container remove 4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=bold_einstein, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/573438380' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:09:24 compute-0 magical_taussig[89244]: 
Dec 03 21:09:24 compute-0 magical_taussig[89244]: {"fsid":"c21de27e-a7fd-594b-8324-0697ba9aab3a","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":82,"monmap":{"epoch":1,"min_mon_release_name":"tentacle","num_mons":1},"osdmap":{"epoch":14,"num_osds":3,"num_up_osds":2,"osd_up_since":1764796159,"num_in_osds":3,"osd_in_since":1764796141,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"creating+peering","count":1}],"num_pgs":1,"num_pools":1,"num_objects":0,"data_bytes":0,"bytes_used":894058496,"bytes_avail":42047225856,"bytes_total":42941284352,"inactive_pgs_ratio":1},"fsmap":{"epoch":1,"btime":"2025-12-03T21:07:59:373870+0000","by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs"],"services":{}},"servicemap":{"epoch":2,"modified":"2025-12-03T21:09:23.137474+0000","services":{"osd":{"daemons":{"summary":"","0":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}},"1":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}}}}}},"progress_events":{}}
Dec 03 21:09:24 compute-0 systemd[1]: libpod-conmon-4e464f41b43a8694d885396947f179f332e72481d93b792055effec957f5f58e.scope: Deactivated successfully.
Dec 03 21:09:24 compute-0 systemd[1]: libpod-19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6.scope: Deactivated successfully.
Dec 03 21:09:24 compute-0 podman[89226]: 2025-12-03 21:09:24.185660653 +0000 UTC m=+0.747359304 container died 19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6 (image=quay.io/ceph/ceph:v20, name=magical_taussig, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 03 21:09:24 compute-0 sudo[89100]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-873fe2228dc62cfe9aed1332bdfed7a339a959c1d31123a871a0ee7e7081789a-merged.mount: Deactivated successfully.
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mgr[75500]: [cephadm INFO root] Adjusting osd_memory_target on compute-0 to 43689k
Dec 03 21:09:24 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on compute-0 to 43689k
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Dec 03 21:09:24 compute-0 ceph-mgr[75500]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on compute-0 to 44738286: error parsing value: Value '44738286' is below minimum 939524096
Dec 03 21:09:24 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on compute-0 to 44738286: error parsing value: Value '44738286' is below minimum 939524096
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:24 compute-0 podman[89226]: 2025-12-03 21:09:24.262614996 +0000 UTC m=+0.824313647 container remove 19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6 (image=quay.io/ceph/ceph:v20, name=magical_taussig, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:24 compute-0 systemd[1]: libpod-conmon-19a9774ed7730953ef577f760610f6a67d4c1a4aeb8c74c9b1b8206df763a9c6.scope: Deactivated successfully.
Dec 03 21:09:24 compute-0 sudo[89209]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:24 compute-0 sudo[90077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:24 compute-0 sudo[90077]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:24 compute-0 sudo[90077]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:24 compute-0 sudo[90102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:09:24 compute-0 sudo[90102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 28.496 iops: 7294.884 elapsed_sec: 0.411
Dec 03 21:09:24 compute-0 ceph-osd[88129]: log_channel(cluster) log [WRN] : OSD bench result of 7294.884357 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 0 waiting for initial osdmap
Dec 03 21:09:24 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2[88125]: 2025-12-03T21:09:24.461+0000 7f53bc55c640 -1 osd.2 0 waiting for initial osdmap
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 14 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 14 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 14 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 14 check_osdmap_features require_osd_release unknown -> tentacle
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 14 set_numa_affinity not setting numa affinity
Dec 03 21:09:24 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-osd-2[88125]: 2025-12-03T21:09:24.490+0000 7f53b6b4f640 -1 osd.2 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 03 21:09:24 compute-0 ceph-osd[88129]: osd.2 14 _collect_metadata loop5:  no unique device id for loop5: fallback method has no model nor serial no unique device path for loop5: no symlink to loop5 in /dev/disk/by-path
Dec 03 21:09:24 compute-0 sudo[90150]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjsalwiutjhcswoqoqowdpjhupaiqrxq ; /usr/bin/python3'
Dec 03 21:09:24 compute-0 sudo[90150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:24 compute-0 podman[90165]: 2025-12-03 21:09:24.700829602 +0000 UTC m=+0.067084883 container create ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_bartik, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:24 compute-0 python3[90152]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create vms  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:24 compute-0 systemd[1]: Started libpod-conmon-ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5.scope.
Dec 03 21:09:24 compute-0 podman[90165]: 2025-12-03 21:09:24.673811589 +0000 UTC m=+0.040066980 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:24 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:24 compute-0 podman[90179]: 2025-12-03 21:09:24.777506708 +0000 UTC m=+0.048910350 container create fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6 (image=quay.io/ceph/ceph:v20, name=suspicious_wing, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 03 21:09:24 compute-0 podman[90165]: 2025-12-03 21:09:24.796269962 +0000 UTC m=+0.162525313 container init ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_bartik, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:09:24 compute-0 podman[90165]: 2025-12-03 21:09:24.804640453 +0000 UTC m=+0.170895734 container start ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_bartik, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:09:24 compute-0 podman[90165]: 2025-12-03 21:09:24.808392219 +0000 UTC m=+0.174647540 container attach ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_bartik, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:24 compute-0 interesting_bartik[90192]: 167 167
Dec 03 21:09:24 compute-0 podman[90165]: 2025-12-03 21:09:24.811682376 +0000 UTC m=+0.177937717 container died ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_bartik, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:09:24 compute-0 systemd[1]: Started libpod-conmon-fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6.scope.
Dec 03 21:09:24 compute-0 systemd[1]: libpod-ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5.scope: Deactivated successfully.
Dec 03 21:09:24 compute-0 ceph-mgr[75500]: mgr.server handle_open ignoring open from osd.2 v2:192.168.122.100:6810/563643625; not ready for session (expect reconnect)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:24 compute-0 ceph-mgr[75500]: mgr finish mon failed to return metadata for osd.2: (2) No such file or directory
Dec 03 21:09:24 compute-0 podman[90179]: 2025-12-03 21:09:24.751974896 +0000 UTC m=+0.023378588 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:24 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8652d559628f04af2c3ee4f098ff5f5f58aeee442b79793aeb1ff92c5bd81de/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8652d559628f04af2c3ee4f098ff5f5f58aeee442b79793aeb1ff92c5bd81de/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-a94b14fe4b1b8d5ce6484caa2465d6e7bb404a5acaf57f9bd6770f6093e60e9e-merged.mount: Deactivated successfully.
Dec 03 21:09:24 compute-0 podman[90179]: 2025-12-03 21:09:24.88035703 +0000 UTC m=+0.151760692 container init fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6 (image=quay.io/ceph/ceph:v20, name=suspicious_wing, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:24 compute-0 podman[90165]: 2025-12-03 21:09:24.889196551 +0000 UTC m=+0.255451872 container remove ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_bartik, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:09:24 compute-0 podman[90179]: 2025-12-03 21:09:24.891062859 +0000 UTC m=+0.162466471 container start fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6 (image=quay.io/ceph/ceph:v20, name=suspicious_wing, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:09:24 compute-0 podman[90179]: 2025-12-03 21:09:24.8945528 +0000 UTC m=+0.165956732 container attach fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6 (image=quay.io/ceph/ceph:v20, name=suspicious_wing, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:24 compute-0 systemd[1]: libpod-conmon-ea60f694b1e6af445a8ce81aa2e95dd92618a88d8843915bf3f53c68981525d5.scope: Deactivated successfully.
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/573438380' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: Adjusting osd_memory_target on compute-0 to 43689k
Dec 03 21:09:24 compute-0 ceph-mon[75204]: Unable to set osd_memory_target on compute-0 to 44738286: error parsing value: Value '44738286' is below minimum 939524096
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:25 compute-0 podman[90224]: 2025-12-03 21:09:25.056685784 +0000 UTC m=+0.048711157 container create d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_noether, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:09:25 compute-0 systemd[1]: Started libpod-conmon-d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e.scope.
Dec 03 21:09:25 compute-0 podman[90224]: 2025-12-03 21:09:25.034018551 +0000 UTC m=+0.026043904 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v37: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 03 21:09:25 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f28e964939078fd006a1453b06ba24f189fa6bcf6b2420c3f514e16304d8f79/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f28e964939078fd006a1453b06ba24f189fa6bcf6b2420c3f514e16304d8f79/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f28e964939078fd006a1453b06ba24f189fa6bcf6b2420c3f514e16304d8f79/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f28e964939078fd006a1453b06ba24f189fa6bcf6b2420c3f514e16304d8f79/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f28e964939078fd006a1453b06ba24f189fa6bcf6b2420c3f514e16304d8f79/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:25 compute-0 podman[90224]: 2025-12-03 21:09:25.166252313 +0000 UTC m=+0.158277686 container init d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_noether, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:25 compute-0 podman[90224]: 2025-12-03 21:09:25.175059573 +0000 UTC m=+0.167084906 container start d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_noether, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:09:25 compute-0 podman[90224]: 2025-12-03 21:09:25.179003154 +0000 UTC m=+0.171028547 container attach d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_noether, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e14 do_prune osdmap full prune enabled
Dec 03 21:09:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e15 e15: 3 total, 3 up, 3 in
Dec 03 21:09:25 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625] boot
Dec 03 21:09:25 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e15: 3 total, 3 up, 3 in
Dec 03 21:09:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Dec 03 21:09:25 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:25 compute-0 ceph-osd[88129]: osd.2 15 state: booting -> active
Dec 03 21:09:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0)
Dec 03 21:09:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3946086394' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:25 compute-0 sweet_noether[90259]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:09:25 compute-0 sweet_noether[90259]: --> All data devices are unavailable
Dec 03 21:09:25 compute-0 systemd[1]: libpod-d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e.scope: Deactivated successfully.
Dec 03 21:09:25 compute-0 podman[90224]: 2025-12-03 21:09:25.813050971 +0000 UTC m=+0.805076344 container died d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_noether, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-5f28e964939078fd006a1453b06ba24f189fa6bcf6b2420c3f514e16304d8f79-merged.mount: Deactivated successfully.
Dec 03 21:09:25 compute-0 podman[90224]: 2025-12-03 21:09:25.869968164 +0000 UTC m=+0.861993537 container remove d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_noether, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:25 compute-0 systemd[1]: libpod-conmon-d3bd1e8370eeaeb7f7b50dfe6a29532dda7bc1ae03e1ade06a46cfdc30ba689e.scope: Deactivated successfully.
Dec 03 21:09:25 compute-0 sudo[90102]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:25 compute-0 ceph-mon[75204]: OSD bench result of 7294.884357 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 03 21:09:25 compute-0 ceph-mon[75204]: pgmap v37: 1 pgs: 1 creating+peering; 0 B data, 853 MiB used, 39 GiB / 40 GiB avail
Dec 03 21:09:25 compute-0 ceph-mon[75204]: osd.2 [v2:192.168.122.100:6810/563643625,v1:192.168.122.100:6811/563643625] boot
Dec 03 21:09:25 compute-0 ceph-mon[75204]: osdmap e15: 3 total, 3 up, 3 in
Dec 03 21:09:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Dec 03 21:09:25 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3946086394' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:26 compute-0 sudo[90293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:26 compute-0 sudo[90293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:26 compute-0 sudo[90293]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:26 compute-0 sudo[90318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:09:26 compute-0 sudo[90318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e15 do_prune osdmap full prune enabled
Dec 03 21:09:26 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3946086394' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e16 e16: 3 total, 3 up, 3 in
Dec 03 21:09:26 compute-0 suspicious_wing[90203]: pool 'vms' created
Dec 03 21:09:26 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e16: 3 total, 3 up, 3 in
Dec 03 21:09:26 compute-0 systemd[1]: libpod-fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6.scope: Deactivated successfully.
Dec 03 21:09:26 compute-0 podman[90179]: 2025-12-03 21:09:26.298875709 +0000 UTC m=+1.570279351 container died fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6 (image=quay.io/ceph/ceph:v20, name=suspicious_wing, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-d8652d559628f04af2c3ee4f098ff5f5f58aeee442b79793aeb1ff92c5bd81de-merged.mount: Deactivated successfully.
Dec 03 21:09:26 compute-0 podman[90179]: 2025-12-03 21:09:26.364338957 +0000 UTC m=+1.635742579 container remove fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6 (image=quay.io/ceph/ceph:v20, name=suspicious_wing, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:26 compute-0 systemd[1]: libpod-conmon-fcc47cdc07c0382dc0656ae82187f6c347773c621910eda474f72d3742d8eaf6.scope: Deactivated successfully.
Dec 03 21:09:26 compute-0 sudo[90150]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:26 compute-0 podman[90366]: 2025-12-03 21:09:26.455860377 +0000 UTC m=+0.051390391 container create d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_wescoff, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:09:26 compute-0 systemd[1]: Started libpod-conmon-d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06.scope.
Dec 03 21:09:26 compute-0 podman[90366]: 2025-12-03 21:09:26.428465058 +0000 UTC m=+0.023995082 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:26 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:26 compute-0 sudo[90410]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnfxitlegcnhrrfsiendhmbhokkfautp ; /usr/bin/python3'
Dec 03 21:09:26 compute-0 sudo[90410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:26 compute-0 podman[90366]: 2025-12-03 21:09:26.553532463 +0000 UTC m=+0.149062527 container init d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_wescoff, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:09:26 compute-0 podman[90366]: 2025-12-03 21:09:26.562713491 +0000 UTC m=+0.158243505 container start d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_wescoff, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:26 compute-0 podman[90366]: 2025-12-03 21:09:26.566697663 +0000 UTC m=+0.162227677 container attach d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_wescoff, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:26 compute-0 elegant_wescoff[90402]: 167 167
Dec 03 21:09:26 compute-0 systemd[1]: libpod-d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06.scope: Deactivated successfully.
Dec 03 21:09:26 compute-0 podman[90366]: 2025-12-03 21:09:26.568941349 +0000 UTC m=+0.164471363 container died d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_wescoff, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:09:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-ddeb74f34573c270bee8ae1e30328389ccf50e4cc10a8f0d01f859cebc45604a-merged.mount: Deactivated successfully.
Dec 03 21:09:26 compute-0 podman[90366]: 2025-12-03 21:09:26.628083077 +0000 UTC m=+0.223613061 container remove d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_wescoff, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:26 compute-0 systemd[1]: libpod-conmon-d4daf438336c624ac52d81b4d91e7d71c9b29a356bd90059b7ed918586dbba06.scope: Deactivated successfully.
Dec 03 21:09:26 compute-0 python3[90412]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create volumes  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e16 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:26 compute-0 podman[90428]: 2025-12-03 21:09:26.772261583 +0000 UTC m=+0.064116691 container create 4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e (image=quay.io/ceph/ceph:v20, name=elegant_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:09:26 compute-0 systemd[1]: Started libpod-conmon-4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e.scope.
Dec 03 21:09:26 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79be4df1520491daa333f3fee306ec0da24416a052e44b909d4d6b99aa62cce8/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79be4df1520491daa333f3fee306ec0da24416a052e44b909d4d6b99aa62cce8/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:26 compute-0 podman[90445]: 2025-12-03 21:09:26.828996113 +0000 UTC m=+0.048528543 container create 99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_payne, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3)
Dec 03 21:09:26 compute-0 podman[90428]: 2025-12-03 21:09:26.842950449 +0000 UTC m=+0.134805667 container init 4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e (image=quay.io/ceph/ceph:v20, name=elegant_goldstine, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:09:26 compute-0 podman[90428]: 2025-12-03 21:09:26.75255547 +0000 UTC m=+0.044410598 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:26 compute-0 podman[90428]: 2025-12-03 21:09:26.856473674 +0000 UTC m=+0.148328782 container start 4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e (image=quay.io/ceph/ceph:v20, name=elegant_goldstine, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:09:26 compute-0 podman[90428]: 2025-12-03 21:09:26.860192761 +0000 UTC m=+0.152047909 container attach 4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e (image=quay.io/ceph/ceph:v20, name=elegant_goldstine, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:26 compute-0 systemd[1]: Started libpod-conmon-99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd.scope.
Dec 03 21:09:26 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf70c8d3d73d7ccb030155250a930e6cab91941bb14cab2858c287ee05a7c63d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf70c8d3d73d7ccb030155250a930e6cab91941bb14cab2858c287ee05a7c63d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf70c8d3d73d7ccb030155250a930e6cab91941bb14cab2858c287ee05a7c63d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:26 compute-0 podman[90445]: 2025-12-03 21:09:26.80782769 +0000 UTC m=+0.027360170 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cf70c8d3d73d7ccb030155250a930e6cab91941bb14cab2858c287ee05a7c63d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:26 compute-0 podman[90445]: 2025-12-03 21:09:26.910978418 +0000 UTC m=+0.130510928 container init 99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_payne, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:09:26 compute-0 podman[90445]: 2025-12-03 21:09:26.916662145 +0000 UTC m=+0.136194615 container start 99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_payne, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:09:26 compute-0 podman[90445]: 2025-12-03 21:09:26.92036952 +0000 UTC m=+0.139901990 container attach 99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_payne, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:27 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 16 pg[2.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [2] r=0 lpr=16 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:09:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v40: 2 pgs: 1 unknown, 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:27 compute-0 beautiful_payne[90469]: {
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:     "0": [
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:         {
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "devices": [
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "/dev/loop3"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             ],
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_name": "ceph_lv0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_size": "21470642176",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "name": "ceph_lv0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "tags": {
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.crush_device_class": "",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.encrypted": "0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osd_id": "0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.type": "block",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.vdo": "0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.with_tpm": "0"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             },
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "type": "block",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "vg_name": "ceph_vg0"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:         }
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:     ],
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:     "1": [
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:         {
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "devices": [
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "/dev/loop4"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             ],
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_name": "ceph_lv1",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_size": "21470642176",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "name": "ceph_lv1",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "tags": {
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.crush_device_class": "",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.encrypted": "0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osd_id": "1",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.type": "block",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.vdo": "0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.with_tpm": "0"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             },
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "type": "block",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "vg_name": "ceph_vg1"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:         }
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:     ],
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:     "2": [
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:         {
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "devices": [
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "/dev/loop5"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             ],
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_name": "ceph_lv2",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_size": "21470642176",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "name": "ceph_lv2",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "tags": {
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.crush_device_class": "",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.encrypted": "0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osd_id": "2",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.type": "block",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.vdo": "0",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:                 "ceph.with_tpm": "0"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             },
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "type": "block",
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:             "vg_name": "ceph_vg2"
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:         }
Dec 03 21:09:27 compute-0 beautiful_payne[90469]:     ]
Dec 03 21:09:27 compute-0 beautiful_payne[90469]: }
Dec 03 21:09:27 compute-0 systemd[1]: libpod-99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd.scope: Deactivated successfully.
Dec 03 21:09:27 compute-0 podman[90445]: 2025-12-03 21:09:27.230663751 +0000 UTC m=+0.450196191 container died 99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_payne, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:09:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0)
Dec 03 21:09:27 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/662270332' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e16 do_prune osdmap full prune enabled
Dec 03 21:09:27 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3946086394' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:27 compute-0 ceph-mon[75204]: osdmap e16: 3 total, 3 up, 3 in
Dec 03 21:09:27 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/662270332' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:27 compute-0 podman[90445]: 2025-12-03 21:09:27.274739803 +0000 UTC m=+0.494272243 container remove 99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_payne, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:09:27 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/662270332' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e17 e17: 3 total, 3 up, 3 in
Dec 03 21:09:27 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e17: 3 total, 3 up, 3 in
Dec 03 21:09:27 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 17 pg[2.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=0/0 les/c/f=0/0/0 sis=16) [2] r=0 lpr=16 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:09:27 compute-0 elegant_goldstine[90461]: pool 'volumes' created
Dec 03 21:09:27 compute-0 systemd[1]: libpod-conmon-99d9d726353097ca0ac7c9d1949672f521a9f924181374ae8c4f3e8ecc6db0cd.scope: Deactivated successfully.
Dec 03 21:09:27 compute-0 systemd[1]: libpod-4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e.scope: Deactivated successfully.
Dec 03 21:09:27 compute-0 podman[90428]: 2025-12-03 21:09:27.315297891 +0000 UTC m=+0.607153049 container died 4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e (image=quay.io/ceph/ceph:v20, name=elegant_goldstine, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:27 compute-0 sudo[90318]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-cf70c8d3d73d7ccb030155250a930e6cab91941bb14cab2858c287ee05a7c63d-merged.mount: Deactivated successfully.
Dec 03 21:09:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-79be4df1520491daa333f3fee306ec0da24416a052e44b909d4d6b99aa62cce8-merged.mount: Deactivated successfully.
Dec 03 21:09:27 compute-0 podman[90428]: 2025-12-03 21:09:27.359715639 +0000 UTC m=+0.651570747 container remove 4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e (image=quay.io/ceph/ceph:v20, name=elegant_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:27 compute-0 systemd[1]: libpod-conmon-4b116689e25850e8d0981a017b8567f51fc4f4bb04e08a83ce4a1710fca6892e.scope: Deactivated successfully.
Dec 03 21:09:27 compute-0 sudo[90410]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:27 compute-0 sudo[90520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:27 compute-0 sudo[90520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:27 compute-0 sudo[90520]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:27 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 17 pg[3.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [1] r=0 lpr=17 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:09:27 compute-0 sudo[90549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:09:27 compute-0 sudo[90549]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:27 compute-0 sudo[90597]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pncdzfogsvyhrfhcrpqpwzoslevkwbpc ; /usr/bin/python3'
Dec 03 21:09:27 compute-0 sudo[90597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:27 compute-0 python3[90599]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create backups  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:27 compute-0 podman[90600]: 2025-12-03 21:09:27.700447292 +0000 UTC m=+0.065144333 container create f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8 (image=quay.io/ceph/ceph:v20, name=unruffled_leakey, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 03 21:09:27 compute-0 podman[90623]: 2025-12-03 21:09:27.727993965 +0000 UTC m=+0.041015809 container create 10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_blackburn, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 03 21:09:27 compute-0 systemd[1]: Started libpod-conmon-f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8.scope.
Dec 03 21:09:27 compute-0 systemd[1]: Started libpod-conmon-10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981.scope.
Dec 03 21:09:27 compute-0 podman[90600]: 2025-12-03 21:09:27.677414801 +0000 UTC m=+0.042111952 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22cd3eb5bf7e9ac48539436650c1b50e44f6bc5d82ca06b16434423ffb956412/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22cd3eb5bf7e9ac48539436650c1b50e44f6bc5d82ca06b16434423ffb956412/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:27 compute-0 podman[90600]: 2025-12-03 21:09:27.790760248 +0000 UTC m=+0.155457319 container init f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8 (image=quay.io/ceph/ceph:v20, name=unruffled_leakey, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:27 compute-0 podman[90600]: 2025-12-03 21:09:27.79724013 +0000 UTC m=+0.161937181 container start f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8 (image=quay.io/ceph/ceph:v20, name=unruffled_leakey, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:27 compute-0 podman[90623]: 2025-12-03 21:09:27.801097139 +0000 UTC m=+0.114118983 container init 10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_blackburn, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:09:27 compute-0 podman[90623]: 2025-12-03 21:09:27.806028519 +0000 UTC m=+0.119050363 container start 10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_blackburn, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True)
Dec 03 21:09:27 compute-0 awesome_blackburn[90645]: 167 167
Dec 03 21:09:27 compute-0 podman[90600]: 2025-12-03 21:09:27.809072652 +0000 UTC m=+0.173769703 container attach f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8 (image=quay.io/ceph/ceph:v20, name=unruffled_leakey, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:09:27 compute-0 systemd[1]: libpod-10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981.scope: Deactivated successfully.
Dec 03 21:09:27 compute-0 podman[90623]: 2025-12-03 21:09:27.712994218 +0000 UTC m=+0.026016082 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:27 compute-0 podman[90623]: 2025-12-03 21:09:27.812851289 +0000 UTC m=+0.125873133 container attach 10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_blackburn, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:27 compute-0 podman[90623]: 2025-12-03 21:09:27.813664886 +0000 UTC m=+0.126686730 container died 10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_blackburn, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:09:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-f4e9c0d8a7c7d63c57b6c876fe77a7e9ca51f0dee9ed00ae4fde1664a6b2c3a7-merged.mount: Deactivated successfully.
Dec 03 21:09:27 compute-0 podman[90623]: 2025-12-03 21:09:27.847755602 +0000 UTC m=+0.160777446 container remove 10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_blackburn, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:09:27 compute-0 systemd[1]: libpod-conmon-10c9eadb775eb7e7053f537857c90485acc5c4f35d186faf25f64f8e0d1c2981.scope: Deactivated successfully.
Dec 03 21:09:28 compute-0 podman[90689]: 2025-12-03 21:09:28.012762624 +0000 UTC m=+0.048285837 container create 6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_bouman, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:09:28 compute-0 systemd[1]: Started libpod-conmon-6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a.scope.
Dec 03 21:09:28 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89367f72cd2e5095a360eac7c3f14bdfa26d0eb7f63501e94ee9304234f54ea5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89367f72cd2e5095a360eac7c3f14bdfa26d0eb7f63501e94ee9304234f54ea5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89367f72cd2e5095a360eac7c3f14bdfa26d0eb7f63501e94ee9304234f54ea5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89367f72cd2e5095a360eac7c3f14bdfa26d0eb7f63501e94ee9304234f54ea5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:28 compute-0 podman[90689]: 2025-12-03 21:09:27.989453038 +0000 UTC m=+0.024976281 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:28 compute-0 podman[90689]: 2025-12-03 21:09:28.092676528 +0000 UTC m=+0.128199721 container init 6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_bouman, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:28 compute-0 podman[90689]: 2025-12-03 21:09:28.10065126 +0000 UTC m=+0.136174463 container start 6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_bouman, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:28 compute-0 podman[90689]: 2025-12-03 21:09:28.104478179 +0000 UTC m=+0.140001392 container attach 6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_bouman, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 03 21:09:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0)
Dec 03 21:09:28 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3215483712' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e17 do_prune osdmap full prune enabled
Dec 03 21:09:28 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3215483712' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e18 e18: 3 total, 3 up, 3 in
Dec 03 21:09:28 compute-0 unruffled_leakey[90638]: pool 'backups' created
Dec 03 21:09:28 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e18: 3 total, 3 up, 3 in
Dec 03 21:09:28 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 18 pg[3.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [1] r=0 lpr=17 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:09:28 compute-0 ceph-mon[75204]: pgmap v40: 2 pgs: 1 unknown, 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:28 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/662270332' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:28 compute-0 ceph-mon[75204]: osdmap e17: 3 total, 3 up, 3 in
Dec 03 21:09:28 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3215483712' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:28 compute-0 systemd[1]: libpod-f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8.scope: Deactivated successfully.
Dec 03 21:09:28 compute-0 podman[90600]: 2025-12-03 21:09:28.325404254 +0000 UTC m=+0.690101415 container died f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8 (image=quay.io/ceph/ceph:v20, name=unruffled_leakey, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 03 21:09:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-22cd3eb5bf7e9ac48539436650c1b50e44f6bc5d82ca06b16434423ffb956412-merged.mount: Deactivated successfully.
Dec 03 21:09:28 compute-0 podman[90600]: 2025-12-03 21:09:28.39077957 +0000 UTC m=+0.755476631 container remove f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8 (image=quay.io/ceph/ceph:v20, name=unruffled_leakey, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:09:28 compute-0 sudo[90597]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:28 compute-0 systemd[1]: libpod-conmon-f41afada032f997a588d4b2a28606912598ba00676edfa41f98bc965324e31d8.scope: Deactivated successfully.
Dec 03 21:09:28 compute-0 sudo[90772]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxvrmwrxsgwmykddakvzqnwpujajlved ; /usr/bin/python3'
Dec 03 21:09:28 compute-0 sudo[90772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:28 compute-0 python3[90780]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create images  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:28 compute-0 podman[90814]: 2025-12-03 21:09:28.721166422 +0000 UTC m=+0.039771464 container create e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df (image=quay.io/ceph/ceph:v20, name=optimistic_mahavira, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:09:28 compute-0 systemd[1]: Started libpod-conmon-e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df.scope.
Dec 03 21:09:28 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ccb1907b3900174711e58c32db0aa519e62affc5bab111b8421bb7b911931a5/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ccb1907b3900174711e58c32db0aa519e62affc5bab111b8421bb7b911931a5/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:28 compute-0 podman[90814]: 2025-12-03 21:09:28.703390459 +0000 UTC m=+0.021995531 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:28 compute-0 podman[90814]: 2025-12-03 21:09:28.805080757 +0000 UTC m=+0.123685849 container init e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df (image=quay.io/ceph/ceph:v20, name=optimistic_mahavira, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:09:28 compute-0 podman[90814]: 2025-12-03 21:09:28.8175308 +0000 UTC m=+0.136135852 container start e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df (image=quay.io/ceph/ceph:v20, name=optimistic_mahavira, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:09:28 compute-0 podman[90814]: 2025-12-03 21:09:28.821988422 +0000 UTC m=+0.140593474 container attach e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df (image=quay.io/ceph/ceph:v20, name=optimistic_mahavira, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:09:28 compute-0 lvm[90846]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:09:28 compute-0 lvm[90846]: VG ceph_vg0 finished
Dec 03 21:09:28 compute-0 lvm[90848]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:09:28 compute-0 lvm[90848]: VG ceph_vg1 finished
Dec 03 21:09:28 compute-0 lvm[90850]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:09:28 compute-0 lvm[90850]: VG ceph_vg2 finished
Dec 03 21:09:28 compute-0 relaxed_bouman[90706]: {}
Dec 03 21:09:28 compute-0 systemd[1]: libpod-6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a.scope: Deactivated successfully.
Dec 03 21:09:28 compute-0 podman[90689]: 2025-12-03 21:09:28.969809993 +0000 UTC m=+1.005333186 container died 6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_bouman, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:28 compute-0 systemd[1]: libpod-6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a.scope: Consumed 1.376s CPU time.
Dec 03 21:09:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-89367f72cd2e5095a360eac7c3f14bdfa26d0eb7f63501e94ee9304234f54ea5-merged.mount: Deactivated successfully.
Dec 03 21:09:29 compute-0 podman[90689]: 2025-12-03 21:09:29.008529425 +0000 UTC m=+1.044052618 container remove 6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_bouman, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:29 compute-0 systemd[1]: libpod-conmon-6158c442b5cebbb00b48d256c858cd046b903f8130b7998abb3fa5cb97d40f3a.scope: Deactivated successfully.
Dec 03 21:09:29 compute-0 sudo[90549]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:29 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 18 pg[4.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:09:29 compute-0 sudo[90882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:09:29 compute-0 sudo[90882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:29 compute-0 sudo[90882]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v43: 4 pgs: 3 unknown, 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0)
Dec 03 21:09:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3503672702' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e18 do_prune osdmap full prune enabled
Dec 03 21:09:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3503672702' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e19 e19: 3 total, 3 up, 3 in
Dec 03 21:09:29 compute-0 optimistic_mahavira[90839]: pool 'images' created
Dec 03 21:09:29 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e19: 3 total, 3 up, 3 in
Dec 03 21:09:29 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 19 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [2] r=0 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:09:29 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3215483712' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:29 compute-0 ceph-mon[75204]: osdmap e18: 3 total, 3 up, 3 in
Dec 03 21:09:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:29 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3503672702' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:29 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3503672702' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:29 compute-0 ceph-mon[75204]: osdmap e19: 3 total, 3 up, 3 in
Dec 03 21:09:29 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 19 pg[4.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:09:29 compute-0 systemd[1]: libpod-e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df.scope: Deactivated successfully.
Dec 03 21:09:29 compute-0 podman[90814]: 2025-12-03 21:09:29.320047961 +0000 UTC m=+0.638653023 container died e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df (image=quay.io/ceph/ceph:v20, name=optimistic_mahavira, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle)
Dec 03 21:09:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-1ccb1907b3900174711e58c32db0aa519e62affc5bab111b8421bb7b911931a5-merged.mount: Deactivated successfully.
Dec 03 21:09:29 compute-0 podman[90814]: 2025-12-03 21:09:29.367337337 +0000 UTC m=+0.685942399 container remove e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df (image=quay.io/ceph/ceph:v20, name=optimistic_mahavira, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:29 compute-0 systemd[1]: libpod-conmon-e81e3efb032b8192d7c9aabba1ea0d68ed6c0d5226cdbdd28fadc2251fd302df.scope: Deactivated successfully.
Dec 03 21:09:29 compute-0 sudo[90772]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:29 compute-0 sudo[90946]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-przrdzwhsvkouxzfkgkklcvgcsxvxsyz ; /usr/bin/python3'
Dec 03 21:09:29 compute-0 sudo[90946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:29 compute-0 python3[90948]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create cephfs.cephfs.meta  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:29 compute-0 podman[90949]: 2025-12-03 21:09:29.73985133 +0000 UTC m=+0.064766715 container create f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298 (image=quay.io/ceph/ceph:v20, name=bold_austin, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:09:29 compute-0 systemd[1]: Started libpod-conmon-f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298.scope.
Dec 03 21:09:29 compute-0 podman[90949]: 2025-12-03 21:09:29.715272087 +0000 UTC m=+0.040187552 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:29 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/164a08ad90a7a2c651afd1989ee4ee019c13822cc260beaf48909ee57096570e/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/164a08ad90a7a2c651afd1989ee4ee019c13822cc260beaf48909ee57096570e/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:29 compute-0 podman[90949]: 2025-12-03 21:09:29.853252587 +0000 UTC m=+0.178168062 container init f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298 (image=quay.io/ceph/ceph:v20, name=bold_austin, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True)
Dec 03 21:09:29 compute-0 podman[90949]: 2025-12-03 21:09:29.86415206 +0000 UTC m=+0.189067445 container start f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298 (image=quay.io/ceph/ceph:v20, name=bold_austin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:29 compute-0 podman[90949]: 2025-12-03 21:09:29.86761938 +0000 UTC m=+0.192534795 container attach f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298 (image=quay.io/ceph/ceph:v20, name=bold_austin, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:09:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e19 do_prune osdmap full prune enabled
Dec 03 21:09:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e20 e20: 3 total, 3 up, 3 in
Dec 03 21:09:30 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e20: 3 total, 3 up, 3 in
Dec 03 21:09:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0)
Dec 03 21:09:30 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/319641924' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:30 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 20 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [2] r=0 lpr=19 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:09:30 compute-0 ceph-mon[75204]: pgmap v43: 4 pgs: 3 unknown, 1 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:30 compute-0 ceph-mon[75204]: osdmap e20: 3 total, 3 up, 3 in
Dec 03 21:09:30 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/319641924' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v46: 5 pgs: 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e20 do_prune osdmap full prune enabled
Dec 03 21:09:31 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/319641924' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e21 e21: 3 total, 3 up, 3 in
Dec 03 21:09:31 compute-0 bold_austin[90964]: pool 'cephfs.cephfs.meta' created
Dec 03 21:09:31 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e21: 3 total, 3 up, 3 in
Dec 03 21:09:31 compute-0 systemd[1]: libpod-f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298.scope: Deactivated successfully.
Dec 03 21:09:31 compute-0 podman[90949]: 2025-12-03 21:09:31.345757048 +0000 UTC m=+1.670672463 container died f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298 (image=quay.io/ceph/ceph:v20, name=bold_austin, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-164a08ad90a7a2c651afd1989ee4ee019c13822cc260beaf48909ee57096570e-merged.mount: Deactivated successfully.
Dec 03 21:09:31 compute-0 podman[90949]: 2025-12-03 21:09:31.400623879 +0000 UTC m=+1.725539264 container remove f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298 (image=quay.io/ceph/ceph:v20, name=bold_austin, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:09:31 compute-0 systemd[1]: libpod-conmon-f45dae8ae11895c9efcae88b28783b9992a95f22a40ed4e037234ee9347fa298.scope: Deactivated successfully.
Dec 03 21:09:31 compute-0 sudo[90946]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:31 compute-0 sudo[91026]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieoidybhclqykfugyyioibyajhacuugb ; /usr/bin/python3'
Dec 03 21:09:31 compute-0 sudo[91026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:31 compute-0 python3[91028]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool create cephfs.cephfs.data  replicated_rule --autoscale-mode on _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e21 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:31 compute-0 podman[91029]: 2025-12-03 21:09:31.783996483 +0000 UTC m=+0.050829809 container create 98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10 (image=quay.io/ceph/ceph:v20, name=condescending_yalow, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:31 compute-0 systemd[1]: Started libpod-conmon-98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10.scope.
Dec 03 21:09:31 compute-0 podman[91029]: 2025-12-03 21:09:31.761786799 +0000 UTC m=+0.028620145 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:31 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43a6d3a35d85bab64c3e5fb22fe52c13c83732fecf2ae50c255e755e2912142a/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43a6d3a35d85bab64c3e5fb22fe52c13c83732fecf2ae50c255e755e2912142a/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:31 compute-0 podman[91029]: 2025-12-03 21:09:31.879425314 +0000 UTC m=+0.146258660 container init 98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10 (image=quay.io/ceph/ceph:v20, name=condescending_yalow, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:09:31 compute-0 podman[91029]: 2025-12-03 21:09:31.886919457 +0000 UTC m=+0.153752813 container start 98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10 (image=quay.io/ceph/ceph:v20, name=condescending_yalow, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:09:31 compute-0 podman[91029]: 2025-12-03 21:09:31.891205685 +0000 UTC m=+0.158039101 container attach 98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10 (image=quay.io/ceph/ceph:v20, name=condescending_yalow, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:09:32 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 21 pg[6.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:09:32 compute-0 ceph-mon[75204]: pgmap v46: 5 pgs: 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:32 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/319641924' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:32 compute-0 ceph-mon[75204]: osdmap e21: 3 total, 3 up, 3 in
Dec 03 21:09:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e21 do_prune osdmap full prune enabled
Dec 03 21:09:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e22 e22: 3 total, 3 up, 3 in
Dec 03 21:09:32 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e22: 3 total, 3 up, 3 in
Dec 03 21:09:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} v 0)
Dec 03 21:09:32 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3331846641' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:32 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 22 pg[6.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:09:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v49: 6 pgs: 1 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e22 do_prune osdmap full prune enabled
Dec 03 21:09:33 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3331846641' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e23 e23: 3 total, 3 up, 3 in
Dec 03 21:09:33 compute-0 condescending_yalow[91045]: pool 'cephfs.cephfs.data' created
Dec 03 21:09:33 compute-0 ceph-mon[75204]: osdmap e22: 3 total, 3 up, 3 in
Dec 03 21:09:33 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3331846641' entity='client.admin' cmd={"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"} : dispatch
Dec 03 21:09:33 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e23: 3 total, 3 up, 3 in
Dec 03 21:09:33 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 23 pg[7.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [1] r=0 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:09:33 compute-0 systemd[1]: libpod-98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10.scope: Deactivated successfully.
Dec 03 21:09:33 compute-0 podman[91029]: 2025-12-03 21:09:33.382773127 +0000 UTC m=+1.649606473 container died 98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10 (image=quay.io/ceph/ceph:v20, name=condescending_yalow, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:09:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-43a6d3a35d85bab64c3e5fb22fe52c13c83732fecf2ae50c255e755e2912142a-merged.mount: Deactivated successfully.
Dec 03 21:09:33 compute-0 podman[91029]: 2025-12-03 21:09:33.436458574 +0000 UTC m=+1.703291910 container remove 98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10 (image=quay.io/ceph/ceph:v20, name=condescending_yalow, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 03 21:09:33 compute-0 systemd[1]: libpod-conmon-98105a1a9647bc4a48453dd0ebf6db2b40448b4f58c0cc58b1ef49ada418ba10.scope: Deactivated successfully.
Dec 03 21:09:33 compute-0 sudo[91026]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:33 compute-0 sudo[91107]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmnmuwavlgxdxqhrwufchninvrjkgnxv ; /usr/bin/python3'
Dec 03 21:09:33 compute-0 sudo[91107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:33 compute-0 python3[91109]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable vms rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:33 compute-0 podman[91110]: 2025-12-03 21:09:33.927822576 +0000 UTC m=+0.061076589 container create b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386 (image=quay.io/ceph/ceph:v20, name=nice_albattani, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:09:33 compute-0 systemd[1]: Started libpod-conmon-b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386.scope.
Dec 03 21:09:33 compute-0 podman[91110]: 2025-12-03 21:09:33.902808635 +0000 UTC m=+0.036062738 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0304529cd862cdd724b2058e7c2d2f47178c9de204f4a57a0f5d98e2cfaae6a5/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0304529cd862cdd724b2058e7c2d2f47178c9de204f4a57a0f5d98e2cfaae6a5/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:34 compute-0 podman[91110]: 2025-12-03 21:09:34.022366257 +0000 UTC m=+0.155620310 container init b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386 (image=quay.io/ceph/ceph:v20, name=nice_albattani, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:09:34 compute-0 podman[91110]: 2025-12-03 21:09:34.032511745 +0000 UTC m=+0.165765758 container start b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386 (image=quay.io/ceph/ceph:v20, name=nice_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:09:34 compute-0 podman[91110]: 2025-12-03 21:09:34.044837677 +0000 UTC m=+0.178091690 container attach b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386 (image=quay.io/ceph/ceph:v20, name=nice_albattani, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 03 21:09:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e23 do_prune osdmap full prune enabled
Dec 03 21:09:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e24 e24: 3 total, 3 up, 3 in
Dec 03 21:09:34 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e24: 3 total, 3 up, 3 in
Dec 03 21:09:34 compute-0 ceph-mon[75204]: pgmap v49: 6 pgs: 1 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:34 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3331846641' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Dec 03 21:09:34 compute-0 ceph-mon[75204]: osdmap e23: 3 total, 3 up, 3 in
Dec 03 21:09:34 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 24 pg[7.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [1] r=0 lpr=23 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:09:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"} v 0)
Dec 03 21:09:34 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3010214953' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"} : dispatch
Dec 03 21:09:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v52: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e24 do_prune osdmap full prune enabled
Dec 03 21:09:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3010214953' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Dec 03 21:09:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e25 e25: 3 total, 3 up, 3 in
Dec 03 21:09:35 compute-0 nice_albattani[91125]: enabled application 'rbd' on pool 'vms'
Dec 03 21:09:35 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e25: 3 total, 3 up, 3 in
Dec 03 21:09:35 compute-0 ceph-mon[75204]: osdmap e24: 3 total, 3 up, 3 in
Dec 03 21:09:35 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3010214953' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"} : dispatch
Dec 03 21:09:35 compute-0 systemd[1]: libpod-b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386.scope: Deactivated successfully.
Dec 03 21:09:35 compute-0 podman[91110]: 2025-12-03 21:09:35.399223815 +0000 UTC m=+1.532477858 container died b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386 (image=quay.io/ceph/ceph:v20, name=nice_albattani, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle)
Dec 03 21:09:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-0304529cd862cdd724b2058e7c2d2f47178c9de204f4a57a0f5d98e2cfaae6a5-merged.mount: Deactivated successfully.
Dec 03 21:09:35 compute-0 podman[91110]: 2025-12-03 21:09:35.442478829 +0000 UTC m=+1.575732882 container remove b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386 (image=quay.io/ceph/ceph:v20, name=nice_albattani, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:35 compute-0 systemd[1]: libpod-conmon-b3bb9742da48102bc3b83ea2bf085d0529efbe0ee844728d36e1860b3bb90386.scope: Deactivated successfully.
Dec 03 21:09:35 compute-0 sudo[91107]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:35 compute-0 sudo[91185]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlxukvguydezfzexbsazvyjgdvrfgota ; /usr/bin/python3'
Dec 03 21:09:35 compute-0 sudo[91185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:35 compute-0 python3[91187]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable volumes rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:35 compute-0 podman[91188]: 2025-12-03 21:09:35.824992437 +0000 UTC m=+0.065199334 container create 294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c (image=quay.io/ceph/ceph:v20, name=hopeful_wilson, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:35 compute-0 systemd[1]: Started libpod-conmon-294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c.scope.
Dec 03 21:09:35 compute-0 podman[91188]: 2025-12-03 21:09:35.791227157 +0000 UTC m=+0.031434094 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:35 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e69572a64d0c43f7ca3380e3e083fc2e1c044d1da2a81b872cb1df0fe473e059/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e69572a64d0c43f7ca3380e3e083fc2e1c044d1da2a81b872cb1df0fe473e059/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:35 compute-0 podman[91188]: 2025-12-03 21:09:35.912337921 +0000 UTC m=+0.152544878 container init 294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c (image=quay.io/ceph/ceph:v20, name=hopeful_wilson, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:35 compute-0 podman[91188]: 2025-12-03 21:09:35.92210489 +0000 UTC m=+0.162311797 container start 294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c (image=quay.io/ceph/ceph:v20, name=hopeful_wilson, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:09:35 compute-0 podman[91188]: 2025-12-03 21:09:35.929346259 +0000 UTC m=+0.169553196 container attach 294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c (image=quay.io/ceph/ceph:v20, name=hopeful_wilson, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"} v 0)
Dec 03 21:09:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3790791762' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"} : dispatch
Dec 03 21:09:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e25 do_prune osdmap full prune enabled
Dec 03 21:09:36 compute-0 ceph-mon[75204]: pgmap v52: 7 pgs: 2 unknown, 5 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:36 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3010214953' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Dec 03 21:09:36 compute-0 ceph-mon[75204]: osdmap e25: 3 total, 3 up, 3 in
Dec 03 21:09:36 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3790791762' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"} : dispatch
Dec 03 21:09:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3790791762' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec 03 21:09:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e26 e26: 3 total, 3 up, 3 in
Dec 03 21:09:36 compute-0 hopeful_wilson[91203]: enabled application 'rbd' on pool 'volumes'
Dec 03 21:09:36 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e26: 3 total, 3 up, 3 in
Dec 03 21:09:36 compute-0 systemd[1]: libpod-294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c.scope: Deactivated successfully.
Dec 03 21:09:36 compute-0 podman[91228]: 2025-12-03 21:09:36.471389947 +0000 UTC m=+0.034273003 container died 294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c (image=quay.io/ceph/ceph:v20, name=hopeful_wilson, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:09:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-e69572a64d0c43f7ca3380e3e083fc2e1c044d1da2a81b872cb1df0fe473e059-merged.mount: Deactivated successfully.
Dec 03 21:09:36 compute-0 podman[91228]: 2025-12-03 21:09:36.521296286 +0000 UTC m=+0.084179312 container remove 294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c (image=quay.io/ceph/ceph:v20, name=hopeful_wilson, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:09:36 compute-0 systemd[1]: libpod-conmon-294c8c6ea310d753c924e4118af2df15110800b4d546d7017cb404d494fc8c6c.scope: Deactivated successfully.
Dec 03 21:09:36 compute-0 sudo[91185]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:36 compute-0 sudo[91266]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejeknkamunvpuojeefcphmjsnjmgotvo ; /usr/bin/python3'
Dec 03 21:09:36 compute-0 sudo[91266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e26 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:36 compute-0 python3[91268]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable backups rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:36 compute-0 podman[91269]: 2025-12-03 21:09:36.964367631 +0000 UTC m=+0.071582774 container create d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf (image=quay.io/ceph/ceph:v20, name=adoring_mendel, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:09:37 compute-0 systemd[1]: Started libpod-conmon-d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf.scope.
Dec 03 21:09:37 compute-0 podman[91269]: 2025-12-03 21:09:36.935788476 +0000 UTC m=+0.043003629 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:37 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74ef1132576152a5f18c5265cc577e0714f82c837a9e162b873e93c070ba1db0/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74ef1132576152a5f18c5265cc577e0714f82c837a9e162b873e93c070ba1db0/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:37 compute-0 podman[91269]: 2025-12-03 21:09:37.067354405 +0000 UTC m=+0.174569568 container init d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf (image=quay.io/ceph/ceph:v20, name=adoring_mendel, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default)
Dec 03 21:09:37 compute-0 podman[91269]: 2025-12-03 21:09:37.075104674 +0000 UTC m=+0.182319787 container start d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf (image=quay.io/ceph/ceph:v20, name=adoring_mendel, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True)
Dec 03 21:09:37 compute-0 podman[91269]: 2025-12-03 21:09:37.078939952 +0000 UTC m=+0.186155155 container attach d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf (image=quay.io/ceph/ceph:v20, name=adoring_mendel, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:09:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v55: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:37 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3790791762' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec 03 21:09:37 compute-0 ceph-mon[75204]: osdmap e26: 3 total, 3 up, 3 in
Dec 03 21:09:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"} v 0)
Dec 03 21:09:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3350298550' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"} : dispatch
Dec 03 21:09:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e26 do_prune osdmap full prune enabled
Dec 03 21:09:38 compute-0 ceph-mon[75204]: pgmap v55: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:38 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3350298550' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"} : dispatch
Dec 03 21:09:38 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3350298550' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec 03 21:09:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e27 e27: 3 total, 3 up, 3 in
Dec 03 21:09:38 compute-0 adoring_mendel[91284]: enabled application 'rbd' on pool 'backups'
Dec 03 21:09:38 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e27: 3 total, 3 up, 3 in
Dec 03 21:09:38 compute-0 systemd[1]: libpod-d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf.scope: Deactivated successfully.
Dec 03 21:09:38 compute-0 podman[91269]: 2025-12-03 21:09:38.457886633 +0000 UTC m=+1.565101786 container died d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf (image=quay.io/ceph/ceph:v20, name=adoring_mendel, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-74ef1132576152a5f18c5265cc577e0714f82c837a9e162b873e93c070ba1db0-merged.mount: Deactivated successfully.
Dec 03 21:09:38 compute-0 podman[91269]: 2025-12-03 21:09:38.510472817 +0000 UTC m=+1.617687960 container remove d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf (image=quay.io/ceph/ceph:v20, name=adoring_mendel, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:09:38 compute-0 systemd[1]: libpod-conmon-d213637b430e61f59ca05a3f9d08b766e2a8b8f55545a0090de9ab1f93ffbbcf.scope: Deactivated successfully.
Dec 03 21:09:38 compute-0 sudo[91266]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:38 compute-0 sudo[91342]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxlatrzpankkbkahobzaiosnbkjkpbhz ; /usr/bin/python3'
Dec 03 21:09:38 compute-0 sudo[91342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:38 compute-0 python3[91344]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable images rbd _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:38 compute-0 podman[91345]: 2025-12-03 21:09:38.951187624 +0000 UTC m=+0.055361693 container create 4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4 (image=quay.io/ceph/ceph:v20, name=blissful_goldberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 03 21:09:38 compute-0 systemd[1]: Started libpod-conmon-4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4.scope.
Dec 03 21:09:39 compute-0 podman[91345]: 2025-12-03 21:09:38.92216705 +0000 UTC m=+0.026341209 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7663e56f9978917969dfd5974a0b6df01a7041aba8e283f1dadc72f3a9ce210/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7663e56f9978917969dfd5974a0b6df01a7041aba8e283f1dadc72f3a9ce210/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:39 compute-0 podman[91345]: 2025-12-03 21:09:39.040474499 +0000 UTC m=+0.144648608 container init 4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4 (image=quay.io/ceph/ceph:v20, name=blissful_goldberg, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:09:39 compute-0 podman[91345]: 2025-12-03 21:09:39.045674804 +0000 UTC m=+0.149848903 container start 4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4 (image=quay.io/ceph/ceph:v20, name=blissful_goldberg, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:39 compute-0 podman[91345]: 2025-12-03 21:09:39.04987489 +0000 UTC m=+0.154048989 container attach 4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4 (image=quay.io/ceph/ceph:v20, name=blissful_goldberg, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:09:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v57: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:39 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3350298550' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec 03 21:09:39 compute-0 ceph-mon[75204]: osdmap e27: 3 total, 3 up, 3 in
Dec 03 21:09:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "images", "app": "rbd"} v 0)
Dec 03 21:09:39 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2912358146' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "images", "app": "rbd"} : dispatch
Dec 03 21:09:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e27 do_prune osdmap full prune enabled
Dec 03 21:09:40 compute-0 ceph-mon[75204]: pgmap v57: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2912358146' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "images", "app": "rbd"} : dispatch
Dec 03 21:09:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2912358146' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec 03 21:09:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e28 e28: 3 total, 3 up, 3 in
Dec 03 21:09:40 compute-0 blissful_goldberg[91361]: enabled application 'rbd' on pool 'images'
Dec 03 21:09:40 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e28: 3 total, 3 up, 3 in
Dec 03 21:09:40 compute-0 systemd[1]: libpod-4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4.scope: Deactivated successfully.
Dec 03 21:09:40 compute-0 podman[91386]: 2025-12-03 21:09:40.526724991 +0000 UTC m=+0.027260148 container died 4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4 (image=quay.io/ceph/ceph:v20, name=blissful_goldberg, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-d7663e56f9978917969dfd5974a0b6df01a7041aba8e283f1dadc72f3a9ce210-merged.mount: Deactivated successfully.
Dec 03 21:09:40 compute-0 podman[91386]: 2025-12-03 21:09:40.568289541 +0000 UTC m=+0.068824588 container remove 4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4 (image=quay.io/ceph/ceph:v20, name=blissful_goldberg, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:40 compute-0 systemd[1]: libpod-conmon-4dc9952fa32780e8267a8db0171891a74b804bffb1fc84240685e030cfd1bcf4.scope: Deactivated successfully.
Dec 03 21:09:40 compute-0 sudo[91342]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:40 compute-0 sudo[91425]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okoomsyekkjzjhhswkwtbkeyvgzbuxpp ; /usr/bin/python3'
Dec 03 21:09:40 compute-0 sudo[91425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:40 compute-0 python3[91427]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable cephfs.cephfs.meta cephfs _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:40 compute-0 podman[91428]: 2025-12-03 21:09:40.964091469 +0000 UTC m=+0.073181046 container create 021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9 (image=quay.io/ceph/ceph:v20, name=laughing_williams, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:09:41 compute-0 systemd[1]: Started libpod-conmon-021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9.scope.
Dec 03 21:09:41 compute-0 podman[91428]: 2025-12-03 21:09:40.930304329 +0000 UTC m=+0.039393936 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:41 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a6019fbea4ee003eef82b1e5d7f1d7c259a18f7abdb8d1d0cae8ce633b5d133/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a6019fbea4ee003eef82b1e5d7f1d7c259a18f7abdb8d1d0cae8ce633b5d133/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:41 compute-0 podman[91428]: 2025-12-03 21:09:41.073062596 +0000 UTC m=+0.182152233 container init 021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9 (image=quay.io/ceph/ceph:v20, name=laughing_williams, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:41 compute-0 podman[91428]: 2025-12-03 21:09:41.078945706 +0000 UTC m=+0.188035273 container start 021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9 (image=quay.io/ceph/ceph:v20, name=laughing_williams, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:41 compute-0 podman[91428]: 2025-12-03 21:09:41.083376427 +0000 UTC m=+0.192466034 container attach 021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9 (image=quay.io/ceph/ceph:v20, name=laughing_williams, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:09:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v59: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2912358146' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec 03 21:09:41 compute-0 ceph-mon[75204]: osdmap e28: 3 total, 3 up, 3 in
Dec 03 21:09:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"} v 0)
Dec 03 21:09:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2642974806' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"} : dispatch
Dec 03 21:09:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e28 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e28 do_prune osdmap full prune enabled
Dec 03 21:09:42 compute-0 ceph-mon[75204]: pgmap v59: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:42 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2642974806' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"} : dispatch
Dec 03 21:09:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/2642974806' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec 03 21:09:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e29 e29: 3 total, 3 up, 3 in
Dec 03 21:09:42 compute-0 laughing_williams[91444]: enabled application 'cephfs' on pool 'cephfs.cephfs.meta'
Dec 03 21:09:42 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e29: 3 total, 3 up, 3 in
Dec 03 21:09:42 compute-0 systemd[1]: libpod-021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9.scope: Deactivated successfully.
Dec 03 21:09:42 compute-0 podman[91428]: 2025-12-03 21:09:42.520955266 +0000 UTC m=+1.630044803 container died 021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9 (image=quay.io/ceph/ceph:v20, name=laughing_williams, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:09:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-6a6019fbea4ee003eef82b1e5d7f1d7c259a18f7abdb8d1d0cae8ce633b5d133-merged.mount: Deactivated successfully.
Dec 03 21:09:42 compute-0 podman[91428]: 2025-12-03 21:09:42.741760289 +0000 UTC m=+1.850849856 container remove 021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9 (image=quay.io/ceph/ceph:v20, name=laughing_williams, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:09:42 compute-0 systemd[1]: libpod-conmon-021b13ecd467cd08702f8c1371605bd1187ca3f654f132d408630cb3c4ff08c9.scope: Deactivated successfully.
Dec 03 21:09:42 compute-0 sudo[91425]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:42 compute-0 sudo[91504]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfcmvnftmghcntvkzsywwptuqjyvnrol ; /usr/bin/python3'
Dec 03 21:09:42 compute-0 sudo[91504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:43 compute-0 python3[91506]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd pool application enable cephfs.cephfs.data cephfs _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v61: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:43 compute-0 podman[91507]: 2025-12-03 21:09:43.183493407 +0000 UTC m=+0.042367358 container create 890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5 (image=quay.io/ceph/ceph:v20, name=trusting_montalcini, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:09:43 compute-0 systemd[1]: Started libpod-conmon-890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5.scope.
Dec 03 21:09:43 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:43 compute-0 podman[91507]: 2025-12-03 21:09:43.164630771 +0000 UTC m=+0.023504742 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e2d1c1c5b33d6d91649d768fd9226eafbbd4c87fa032d31c3ed14fbf09379d2/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e2d1c1c5b33d6d91649d768fd9226eafbbd4c87fa032d31c3ed14fbf09379d2/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:43 compute-0 podman[91507]: 2025-12-03 21:09:43.279457308 +0000 UTC m=+0.138331299 container init 890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5 (image=quay.io/ceph/ceph:v20, name=trusting_montalcini, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:09:43 compute-0 podman[91507]: 2025-12-03 21:09:43.289938522 +0000 UTC m=+0.148812483 container start 890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5 (image=quay.io/ceph/ceph:v20, name=trusting_montalcini, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:09:43 compute-0 podman[91507]: 2025-12-03 21:09:43.294112617 +0000 UTC m=+0.152986578 container attach 890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5 (image=quay.io/ceph/ceph:v20, name=trusting_montalcini, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:09:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2642974806' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec 03 21:09:43 compute-0 ceph-mon[75204]: osdmap e29: 3 total, 3 up, 3 in
Dec 03 21:09:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"} v 0)
Dec 03 21:09:43 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/915051798' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"} : dispatch
Dec 03 21:09:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e29 do_prune osdmap full prune enabled
Dec 03 21:09:44 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/915051798' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec 03 21:09:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e30 e30: 3 total, 3 up, 3 in
Dec 03 21:09:44 compute-0 trusting_montalcini[91523]: enabled application 'cephfs' on pool 'cephfs.cephfs.data'
Dec 03 21:09:44 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e30: 3 total, 3 up, 3 in
Dec 03 21:09:44 compute-0 ceph-mon[75204]: pgmap v61: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:44 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/915051798' entity='client.admin' cmd={"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"} : dispatch
Dec 03 21:09:44 compute-0 systemd[1]: libpod-890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5.scope: Deactivated successfully.
Dec 03 21:09:44 compute-0 podman[91548]: 2025-12-03 21:09:44.570543183 +0000 UTC m=+0.023738977 container died 890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5 (image=quay.io/ceph/ceph:v20, name=trusting_montalcini, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default)
Dec 03 21:09:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-7e2d1c1c5b33d6d91649d768fd9226eafbbd4c87fa032d31c3ed14fbf09379d2-merged.mount: Deactivated successfully.
Dec 03 21:09:44 compute-0 podman[91548]: 2025-12-03 21:09:44.608775214 +0000 UTC m=+0.061970988 container remove 890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5 (image=quay.io/ceph/ceph:v20, name=trusting_montalcini, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:44 compute-0 systemd[1]: libpod-conmon-890e56f16dde7fa1ec7268c563668d8bd6bbaa254cd2c09ced77f4f2f3cacfb5.scope: Deactivated successfully.
Dec 03 21:09:44 compute-0 sudo[91504]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v63: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:45 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/915051798' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec 03 21:09:45 compute-0 ceph-mon[75204]: osdmap e30: 3 total, 3 up, 3 in
Dec 03 21:09:45 compute-0 ceph-mon[75204]: pgmap v63: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:46 compute-0 python3[91638]: ansible-ansible.legacy.stat Invoked with path=/tmp/ceph_mds.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 21:09:46 compute-0 python3[91709]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796186.0829332-36776-133040734642724/source dest=/tmp/ceph_mds.yml mode=0644 force=True follow=False _original_basename=ceph_mds.yml.j2 checksum=e359e26d9e42bc107a0de03375144cf8590b6f68 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:09:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:47 compute-0 sudo[91757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhkmfbsmewsrdlapvmchegpdlsjlbuta ; /usr/bin/python3'
Dec 03 21:09:47 compute-0 sudo[91757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v64: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:47 compute-0 python3[91759]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /tmp/ceph_mds.yml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   fs volume create cephfs '--placement=compute-0 '
                                           _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:47 compute-0 podman[91760]: 2025-12-03 21:09:47.273562422 +0000 UTC m=+0.060637470 container create 79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a (image=quay.io/ceph/ceph:v20, name=jovial_villani, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 03 21:09:47 compute-0 systemd[1]: Started libpod-conmon-79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a.scope.
Dec 03 21:09:47 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e893d492b7f55d3ba56ad358af7638f766b13b53f8695eaf97217c44adcef1bf/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:47 compute-0 podman[91760]: 2025-12-03 21:09:47.252878999 +0000 UTC m=+0.039954097 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e893d492b7f55d3ba56ad358af7638f766b13b53f8695eaf97217c44adcef1bf/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e893d492b7f55d3ba56ad358af7638f766b13b53f8695eaf97217c44adcef1bf/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:47 compute-0 podman[91760]: 2025-12-03 21:09:47.362732494 +0000 UTC m=+0.149807562 container init 79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a (image=quay.io/ceph/ceph:v20, name=jovial_villani, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:09:47 compute-0 podman[91760]: 2025-12-03 21:09:47.37085703 +0000 UTC m=+0.157932078 container start 79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a (image=quay.io/ceph/ceph:v20, name=jovial_villani, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:09:47 compute-0 podman[91760]: 2025-12-03 21:09:47.374860722 +0000 UTC m=+0.161935790 container attach 79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a (image=quay.io/ceph/ceph:v20, name=jovial_villani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:09:47 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14230 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 ", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:09:47 compute-0 ceph-mgr[75500]: [volumes INFO volumes.module] Starting _cmd_fs_volume_create(name:cephfs, placement:compute-0 , prefix:fs volume create, target:['mon-mgr', '']) < ""
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"} v 0)
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"} : dispatch
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"} v 0)
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"} : dispatch
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"} v 0)
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"} : dispatch
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e30 do_prune osdmap full prune enabled
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(cluster) log [WRN] : Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec 03 21:09:47 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0[75200]: 2025-12-03T21:09:47.859+0000 7f6ce6e09640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e2 new map
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e2 print_map
                                           e2
                                           btime 2025-12-03T21:09:47:861269+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-03T21:09:47.860950+0000
                                           modified        2025-12-03T21:09:47.860950+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 e31: 3 total, 3 up, 3 in
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e31: 3 total, 3 up, 3 in
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : fsmap cephfs:0
Dec 03 21:09:47 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service mds.cephfs spec with placement compute-0
Dec 03 21:09:47 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service mds.cephfs spec with placement compute-0
Dec 03 21:09:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0)
Dec 03 21:09:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:47 compute-0 ceph-mgr[75500]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_create(name:cephfs, placement:compute-0 , prefix:fs volume create, target:['mon-mgr', '']) < ""
Dec 03 21:09:47 compute-0 systemd[1]: libpod-79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a.scope: Deactivated successfully.
Dec 03 21:09:47 compute-0 podman[91760]: 2025-12-03 21:09:47.916231626 +0000 UTC m=+0.703306674 container died 79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a (image=quay.io/ceph/ceph:v20, name=jovial_villani, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-e893d492b7f55d3ba56ad358af7638f766b13b53f8695eaf97217c44adcef1bf-merged.mount: Deactivated successfully.
Dec 03 21:09:47 compute-0 sudo[91800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:47 compute-0 sudo[91800]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:47 compute-0 podman[91760]: 2025-12-03 21:09:47.956897667 +0000 UTC m=+0.743972715 container remove 79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a (image=quay.io/ceph/ceph:v20, name=jovial_villani, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:47 compute-0 sudo[91800]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:47 compute-0 systemd[1]: libpod-conmon-79ac7de97b977ed2f279210e9f9f4176261410db8da1ce979cf352b291b5ed1a.scope: Deactivated successfully.
Dec 03 21:09:47 compute-0 sudo[91757]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:48 compute-0 sudo[91838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:09:48 compute-0 sudo[91838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:48 compute-0 sudo[91886]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrrhzzzmkozcdoplxfwgbufcbqlhvgnh ; /usr/bin/python3'
Dec 03 21:09:48 compute-0 sudo[91886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:48 compute-0 ceph-mon[75204]: pgmap v64: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"} : dispatch
Dec 03 21:09:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"} : dispatch
Dec 03 21:09:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"} : dispatch
Dec 03 21:09:48 compute-0 ceph-mon[75204]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 03 21:09:48 compute-0 ceph-mon[75204]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec 03 21:09:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec 03 21:09:48 compute-0 ceph-mon[75204]: osdmap e31: 3 total, 3 up, 3 in
Dec 03 21:09:48 compute-0 ceph-mon[75204]: fsmap cephfs:0
Dec 03 21:09:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:48 compute-0 python3[91888]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z   --volume /tmp/ceph_mds.yml:/home/ceph_spec.yaml:z   --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch apply --in-file /home/ceph_spec.yaml _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:48 compute-0 podman[91905]: 2025-12-03 21:09:48.34409522 +0000 UTC m=+0.049800309 container create d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491 (image=quay.io/ceph/ceph:v20, name=nostalgic_khorana, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030)
Dec 03 21:09:48 compute-0 systemd[1]: Started libpod-conmon-d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491.scope.
Dec 03 21:09:48 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adf1019f02f920b9c4716f66f94920cf55969daffb7fad6d8702de4bb3fe4881/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adf1019f02f920b9c4716f66f94920cf55969daffb7fad6d8702de4bb3fe4881/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adf1019f02f920b9c4716f66f94920cf55969daffb7fad6d8702de4bb3fe4881/merged/home/ceph_spec.yaml supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:48 compute-0 podman[91905]: 2025-12-03 21:09:48.320455056 +0000 UTC m=+0.026160165 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:48 compute-0 podman[91905]: 2025-12-03 21:09:48.417656503 +0000 UTC m=+0.123361612 container init d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491 (image=quay.io/ceph/ceph:v20, name=nostalgic_khorana, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:09:48 compute-0 podman[91905]: 2025-12-03 21:09:48.424663656 +0000 UTC m=+0.130368745 container start d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491 (image=quay.io/ceph/ceph:v20, name=nostalgic_khorana, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:09:48 compute-0 podman[91905]: 2025-12-03 21:09:48.429071756 +0000 UTC m=+0.134776875 container attach d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491 (image=quay.io/ceph/ceph:v20, name=nostalgic_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:09:48 compute-0 podman[91949]: 2025-12-03 21:09:48.467492962 +0000 UTC m=+0.055812492 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:09:48 compute-0 podman[91949]: 2025-12-03 21:09:48.587970544 +0000 UTC m=+0.176290104 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:48 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14232 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:09:48 compute-0 ceph-mgr[75500]: [cephadm INFO root] Saving service mds.cephfs spec with placement compute-0
Dec 03 21:09:48 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Saving service mds.cephfs spec with placement compute-0
Dec 03 21:09:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0)
Dec 03 21:09:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:48 compute-0 nostalgic_khorana[91946]: Scheduled mds.cephfs update...
Dec 03 21:09:48 compute-0 systemd[1]: libpod-d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491.scope: Deactivated successfully.
Dec 03 21:09:48 compute-0 podman[91905]: 2025-12-03 21:09:48.854071532 +0000 UTC m=+0.559776621 container died d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491 (image=quay.io/ceph/ceph:v20, name=nostalgic_khorana, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:09:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-adf1019f02f920b9c4716f66f94920cf55969daffb7fad6d8702de4bb3fe4881-merged.mount: Deactivated successfully.
Dec 03 21:09:48 compute-0 podman[91905]: 2025-12-03 21:09:48.892373294 +0000 UTC m=+0.598078383 container remove d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491 (image=quay.io/ceph/ceph:v20, name=nostalgic_khorana, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:48 compute-0 systemd[1]: libpod-conmon-d565a0a2625fe5bc38cf741f326525219ac4d31bee60a8628a876a29d065e491.scope: Deactivated successfully.
Dec 03 21:09:48 compute-0 sudo[91886]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v66: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:49 compute-0 sudo[91838]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: from='client.14230 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "compute-0 ", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:09:49 compute-0 ceph-mon[75204]: Saving service mds.cephfs spec with placement compute-0
Dec 03 21:09:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:49 compute-0 sudo[92138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:49 compute-0 sudo[92138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:49 compute-0 sudo[92138]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:49 compute-0 sudo[92186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:09:49 compute-0 sudo[92186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:49 compute-0 sudo[92262]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzxngiiorynlmonnvpxdulfqzobdidv ; /usr/bin/python3'
Dec 03 21:09:49 compute-0 sudo[92262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:49 compute-0 python3[92264]: ansible-ansible.legacy.stat Invoked with path=/etc/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 03 21:09:49 compute-0 sudo[92262]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:49 compute-0 sudo[92354]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daklulttgdjtlpcuimvrgjhbmgamhtto ; /usr/bin/python3'
Dec 03 21:09:49 compute-0 sudo[92354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:49 compute-0 sudo[92186]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:09:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:50 compute-0 python3[92356]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796189.2709978-36806-224620764690427/source dest=/etc/ceph/ceph.client.openstack.keyring mode=0644 force=True owner=167 group=167 follow=False _original_basename=ceph_key.j2 checksum=100907596fddba72a04e8a16770dbec161f9317a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:09:50 compute-0 sudo[92354]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:50 compute-0 sudo[92369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:50 compute-0 sudo[92369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:50 compute-0 sudo[92369]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:50 compute-0 sudo[92411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:09:50 compute-0 sudo[92411]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='client.14232 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:09:50 compute-0 ceph-mon[75204]: Saving service mds.cephfs spec with placement compute-0
Dec 03 21:09:50 compute-0 ceph-mon[75204]: pgmap v66: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:09:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:50 compute-0 sudo[92477]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcxdtshgxgovpjsxmrmnbpxvzeyiascr ; /usr/bin/python3'
Dec 03 21:09:50 compute-0 sudo[92477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:50 compute-0 podman[92481]: 2025-12-03 21:09:50.439682276 +0000 UTC m=+0.050324470 container create 5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_einstein, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:09:50 compute-0 systemd[1]: Started libpod-conmon-5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d.scope.
Dec 03 21:09:50 compute-0 python3[92480]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   auth import -i /etc/ceph/ceph.client.openstack.keyring _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:50 compute-0 podman[92481]: 2025-12-03 21:09:50.416284437 +0000 UTC m=+0.026926651 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:50 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:50 compute-0 podman[92481]: 2025-12-03 21:09:50.543105519 +0000 UTC m=+0.153747803 container init 5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_einstein, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:50 compute-0 podman[92481]: 2025-12-03 21:09:50.554616054 +0000 UTC m=+0.165258248 container start 5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_einstein, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:09:50 compute-0 podman[92481]: 2025-12-03 21:09:50.559859932 +0000 UTC m=+0.170502226 container attach 5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_einstein, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:50 compute-0 mystifying_einstein[92497]: 167 167
Dec 03 21:09:50 compute-0 systemd[1]: libpod-5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d.scope: Deactivated successfully.
Dec 03 21:09:50 compute-0 podman[92500]: 2025-12-03 21:09:50.565415475 +0000 UTC m=+0.048099863 container create 374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22 (image=quay.io/ceph/ceph:v20, name=keen_euclid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:50 compute-0 podman[92481]: 2025-12-03 21:09:50.570173592 +0000 UTC m=+0.180815826 container died 5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_einstein, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:09:50 compute-0 systemd[1]: Started libpod-conmon-374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22.scope.
Dec 03 21:09:50 compute-0 podman[92500]: 2025-12-03 21:09:50.543458616 +0000 UTC m=+0.026142984 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:50 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9040bfa689421ad13cbd409261471e0a026e8261204563f751916271015ddab/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9040bfa689421ad13cbd409261471e0a026e8261204563f751916271015ddab/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-056c266f4e704909bb8349b35eef7e9aff1de06c6302c86c00d182fdd8871b25-merged.mount: Deactivated successfully.
Dec 03 21:09:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v67: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:51 compute-0 podman[92481]: 2025-12-03 21:09:51.254955286 +0000 UTC m=+0.865597490 container remove 5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_einstein, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:51 compute-0 systemd[1]: libpod-conmon-5740b1215d7467655a6c54f13f602763e8ee3ec6f745dc4f258205ec8b4bf54d.scope: Deactivated successfully.
Dec 03 21:09:51 compute-0 podman[92500]: 2025-12-03 21:09:51.307197103 +0000 UTC m=+0.789881521 container init 374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22 (image=quay.io/ceph/ceph:v20, name=keen_euclid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:51 compute-0 podman[92500]: 2025-12-03 21:09:51.318964284 +0000 UTC m=+0.801648642 container start 374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22 (image=quay.io/ceph/ceph:v20, name=keen_euclid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:51 compute-0 podman[92500]: 2025-12-03 21:09:51.323054268 +0000 UTC m=+0.805738666 container attach 374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22 (image=quay.io/ceph/ceph:v20, name=keen_euclid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:09:51 compute-0 podman[92538]: 2025-12-03 21:09:51.504174949 +0000 UTC m=+0.053648087 container create fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=recursing_elion, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:51 compute-0 systemd[1]: Started libpod-conmon-fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321.scope.
Dec 03 21:09:51 compute-0 podman[92538]: 2025-12-03 21:09:51.482367513 +0000 UTC m=+0.031840611 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:51 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0309fa741b0f22f4ad240444fc8552e562fa2a168bd8ec4ba11bba7e36540e4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0309fa741b0f22f4ad240444fc8552e562fa2a168bd8ec4ba11bba7e36540e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0309fa741b0f22f4ad240444fc8552e562fa2a168bd8ec4ba11bba7e36540e4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0309fa741b0f22f4ad240444fc8552e562fa2a168bd8ec4ba11bba7e36540e4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0309fa741b0f22f4ad240444fc8552e562fa2a168bd8ec4ba11bba7e36540e4/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:51 compute-0 podman[92538]: 2025-12-03 21:09:51.628978479 +0000 UTC m=+0.178451597 container init fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=recursing_elion, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:51 compute-0 podman[92538]: 2025-12-03 21:09:51.635964343 +0000 UTC m=+0.185437441 container start fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=recursing_elion, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 03 21:09:51 compute-0 podman[92538]: 2025-12-03 21:09:51.641142109 +0000 UTC m=+0.190615207 container attach fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=recursing_elion, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:09:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:09:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:09:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:09:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:09:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:09:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:09:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth import"} v 0)
Dec 03 21:09:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/218063241' entity='client.admin' cmd={"prefix": "auth import"} : dispatch
Dec 03 21:09:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/218063241' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec 03 21:09:51 compute-0 systemd[1]: libpod-374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22.scope: Deactivated successfully.
Dec 03 21:09:51 compute-0 podman[92500]: 2025-12-03 21:09:51.904354548 +0000 UTC m=+1.387038906 container died 374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22 (image=quay.io/ceph/ceph:v20, name=keen_euclid, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:09:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-c9040bfa689421ad13cbd409261471e0a026e8261204563f751916271015ddab-merged.mount: Deactivated successfully.
Dec 03 21:09:51 compute-0 podman[92500]: 2025-12-03 21:09:51.966969327 +0000 UTC m=+1.449653715 container remove 374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22 (image=quay.io/ceph/ceph:v20, name=keen_euclid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:09:51 compute-0 systemd[1]: libpod-conmon-374c7a614008d8c33432bf5e38faca3998a8e26e2c280cbcfa4c56df06633d22.scope: Deactivated successfully.
Dec 03 21:09:51 compute-0 sudo[92477]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:52 compute-0 recursing_elion[92573]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:09:52 compute-0 recursing_elion[92573]: --> All data devices are unavailable
Dec 03 21:09:52 compute-0 systemd[1]: libpod-fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321.scope: Deactivated successfully.
Dec 03 21:09:52 compute-0 podman[92538]: 2025-12-03 21:09:52.181992231 +0000 UTC m=+0.731465369 container died fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=recursing_elion, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-c0309fa741b0f22f4ad240444fc8552e562fa2a168bd8ec4ba11bba7e36540e4-merged.mount: Deactivated successfully.
Dec 03 21:09:52 compute-0 podman[92538]: 2025-12-03 21:09:52.255712378 +0000 UTC m=+0.805185486 container remove fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=recursing_elion, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:09:52 compute-0 ceph-mon[75204]: pgmap v67: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:52 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/218063241' entity='client.admin' cmd={"prefix": "auth import"} : dispatch
Dec 03 21:09:52 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/218063241' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec 03 21:09:52 compute-0 systemd[1]: libpod-conmon-fa38c3e1b1477b5ee5355e6ae935494353f818157c1d28d19ad4ac47dfe44321.scope: Deactivated successfully.
Dec 03 21:09:52 compute-0 sudo[92411]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:52 compute-0 sudo[92617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:52 compute-0 sudo[92617]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:52 compute-0 sudo[92617]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:52 compute-0 sudo[92642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:09:52 compute-0 sudo[92642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:52 compute-0 sudo[92690]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ackvwkfogficodlagvwbdwtuwvnztkld ; /usr/bin/python3'
Dec 03 21:09:52 compute-0 sudo[92690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:52 compute-0 python3[92692]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   status --format json | jq .monmap.num_mons _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:52 compute-0 podman[92705]: 2025-12-03 21:09:52.800109483 +0000 UTC m=+0.062277154 container create 79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bose, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:52 compute-0 podman[92708]: 2025-12-03 21:09:52.811206269 +0000 UTC m=+0.055758760 container create edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b (image=quay.io/ceph/ceph:v20, name=peaceful_mirzakhani, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:52 compute-0 systemd[1]: Started libpod-conmon-79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5.scope.
Dec 03 21:09:52 compute-0 systemd[1]: Started libpod-conmon-edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b.scope.
Dec 03 21:09:52 compute-0 podman[92705]: 2025-12-03 21:09:52.767784993 +0000 UTC m=+0.029952764 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:52 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:52 compute-0 podman[92708]: 2025-12-03 21:09:52.777084202 +0000 UTC m=+0.021636703 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:52 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/704ca13af7f912bf6740fe02e5d08d606cea691ae52e802ccad5b25e41f5ab10/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/704ca13af7f912bf6740fe02e5d08d606cea691ae52e802ccad5b25e41f5ab10/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:52 compute-0 podman[92708]: 2025-12-03 21:09:52.887332886 +0000 UTC m=+0.131885387 container init edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b (image=quay.io/ceph/ceph:v20, name=peaceful_mirzakhani, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:09:52 compute-0 podman[92705]: 2025-12-03 21:09:52.89048207 +0000 UTC m=+0.152649761 container init 79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bose, CEPH_REF=tentacle, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:52 compute-0 podman[92708]: 2025-12-03 21:09:52.893967851 +0000 UTC m=+0.138520332 container start edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b (image=quay.io/ceph/ceph:v20, name=peaceful_mirzakhani, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:52 compute-0 podman[92705]: 2025-12-03 21:09:52.896192657 +0000 UTC m=+0.158360318 container start 79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bose, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:09:52 compute-0 podman[92708]: 2025-12-03 21:09:52.897075585 +0000 UTC m=+0.141628066 container attach edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b (image=quay.io/ceph/ceph:v20, name=peaceful_mirzakhani, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:09:52 compute-0 peaceful_bose[92738]: 167 167
Dec 03 21:09:52 compute-0 systemd[1]: libpod-79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5.scope: Deactivated successfully.
Dec 03 21:09:52 compute-0 podman[92705]: 2025-12-03 21:09:52.900094136 +0000 UTC m=+0.162261827 container attach 79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bose, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:09:52 compute-0 podman[92705]: 2025-12-03 21:09:52.900617277 +0000 UTC m=+0.162784938 container died 79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bose, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-3ed81245f586ef4a8e26d901ef0768ca3ca385113faf8c60d43fd4411743e431-merged.mount: Deactivated successfully.
Dec 03 21:09:52 compute-0 podman[92705]: 2025-12-03 21:09:52.930837185 +0000 UTC m=+0.193004846 container remove 79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bose, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:52 compute-0 systemd[1]: libpod-conmon-79f3a12728fb1a7ea6c7f432d930d4a43f3687b36ff747e71892518fd7e3e5c5.scope: Deactivated successfully.
Dec 03 21:09:53 compute-0 podman[92783]: 2025-12-03 21:09:53.083794491 +0000 UTC m=+0.040195243 container create ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_chaplygin, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 03 21:09:53 compute-0 systemd[1]: Started libpod-conmon-ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e.scope.
Dec 03 21:09:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v68: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:53 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:53 compute-0 podman[92783]: 2025-12-03 21:09:53.06416116 +0000 UTC m=+0.020561952 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cf80829b9283003a96a7bdeb6142ac5d6c48a2576de7b8cb529e7232616d01a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cf80829b9283003a96a7bdeb6142ac5d6c48a2576de7b8cb529e7232616d01a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cf80829b9283003a96a7bdeb6142ac5d6c48a2576de7b8cb529e7232616d01a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cf80829b9283003a96a7bdeb6142ac5d6c48a2576de7b8cb529e7232616d01a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:53 compute-0 podman[92783]: 2025-12-03 21:09:53.17530852 +0000 UTC m=+0.131709322 container init ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_chaplygin, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:09:53 compute-0 podman[92783]: 2025-12-03 21:09:53.186291025 +0000 UTC m=+0.142691817 container start ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_chaplygin, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:09:53 compute-0 podman[92783]: 2025-12-03 21:09:53.190259407 +0000 UTC m=+0.146660179 container attach ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_chaplygin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:09:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 03 21:09:53 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/910190000' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:09:53 compute-0 peaceful_mirzakhani[92740]: 
Dec 03 21:09:53 compute-0 peaceful_mirzakhani[92740]: {"fsid":"c21de27e-a7fd-594b-8324-0697ba9aab3a","health":{"status":"HEALTH_ERR","checks":{"MDS_ALL_DOWN":{"severity":"HEALTH_ERR","summary":{"message":"1 filesystem is offline","count":1},"muted":false},"MDS_UP_LESS_THAN_MAX":{"severity":"HEALTH_WARN","summary":{"message":"1 filesystem is online with fewer MDS than max_mds","count":1},"muted":false}},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":111,"monmap":{"epoch":1,"min_mon_release_name":"tentacle","num_mons":1},"osdmap":{"epoch":31,"num_osds":3,"num_up_osds":3,"osd_up_since":1764796165,"num_in_osds":3,"osd_in_since":1764796141,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":7}],"num_pgs":7,"num_pools":7,"num_objects":2,"data_bytes":459280,"bytes_used":83881984,"bytes_avail":64328044544,"bytes_total":64411926528},"fsmap":{"epoch":2,"btime":"2025-12-03T21:09:47:861269+0000","id":1,"up":0,"in":0,"max":1,"by_rank":[],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs"],"services":{}},"servicemap":{"epoch":2,"modified":"2025-12-03T21:09:23.137474+0000","services":{"osd":{"daemons":{"summary":"","0":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}},"1":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}}}}}},"progress_events":{}}
Dec 03 21:09:53 compute-0 systemd[1]: libpod-edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b.scope: Deactivated successfully.
Dec 03 21:09:53 compute-0 podman[92708]: 2025-12-03 21:09:53.382551676 +0000 UTC m=+0.627104167 container died edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b (image=quay.io/ceph/ceph:v20, name=peaceful_mirzakhani, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 03 21:09:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-704ca13af7f912bf6740fe02e5d08d606cea691ae52e802ccad5b25e41f5ab10-merged.mount: Deactivated successfully.
Dec 03 21:09:53 compute-0 podman[92708]: 2025-12-03 21:09:53.421035232 +0000 UTC m=+0.665587703 container remove edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b (image=quay.io/ceph/ceph:v20, name=peaceful_mirzakhani, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:09:53 compute-0 systemd[1]: libpod-conmon-edb29e53ba9e2c8c4f62aac063650f1075ffff88303af2011f12466c7f02e94b.scope: Deactivated successfully.
Dec 03 21:09:53 compute-0 sudo[92690]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]: {
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:     "0": [
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:         {
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "devices": [
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "/dev/loop3"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             ],
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_name": "ceph_lv0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_size": "21470642176",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "name": "ceph_lv0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "tags": {
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.crush_device_class": "",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.encrypted": "0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osd_id": "0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.type": "block",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.vdo": "0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.with_tpm": "0"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             },
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "type": "block",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "vg_name": "ceph_vg0"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:         }
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:     ],
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:     "1": [
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:         {
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "devices": [
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "/dev/loop4"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             ],
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_name": "ceph_lv1",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_size": "21470642176",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "name": "ceph_lv1",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "tags": {
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.crush_device_class": "",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.encrypted": "0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osd_id": "1",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.type": "block",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.vdo": "0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.with_tpm": "0"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             },
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "type": "block",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "vg_name": "ceph_vg1"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:         }
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:     ],
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:     "2": [
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:         {
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "devices": [
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "/dev/loop5"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             ],
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_name": "ceph_lv2",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_size": "21470642176",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "name": "ceph_lv2",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "tags": {
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.cluster_name": "ceph",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.crush_device_class": "",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.encrypted": "0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.objectstore": "bluestore",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osd_id": "2",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.type": "block",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.vdo": "0",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:                 "ceph.with_tpm": "0"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             },
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "type": "block",
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:             "vg_name": "ceph_vg2"
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:         }
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]:     ]
Dec 03 21:09:53 compute-0 vibrant_chaplygin[92799]: }
Dec 03 21:09:53 compute-0 systemd[1]: libpod-ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e.scope: Deactivated successfully.
Dec 03 21:09:53 compute-0 podman[92783]: 2025-12-03 21:09:53.538013693 +0000 UTC m=+0.494414515 container died ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_chaplygin, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:09:53 compute-0 podman[92783]: 2025-12-03 21:09:53.587002244 +0000 UTC m=+0.543403006 container remove ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_chaplygin, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:53 compute-0 systemd[1]: libpod-conmon-ce3a03a2b83e3e5fe1599f8ce3082bc1f63c120e1f24a21aa3fb118294befb9e.scope: Deactivated successfully.
Dec 03 21:09:53 compute-0 sudo[92854]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umaffielarlrctvcjrmriuzlnypcnkld ; /usr/bin/python3'
Dec 03 21:09:53 compute-0 sudo[92854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:53 compute-0 sudo[92642]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:53 compute-0 sudo[92857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:53 compute-0 sudo[92857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:53 compute-0 sudo[92857]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:53 compute-0 sudo[92882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:09:53 compute-0 sudo[92882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-4cf80829b9283003a96a7bdeb6142ac5d6c48a2576de7b8cb529e7232616d01a-merged.mount: Deactivated successfully.
Dec 03 21:09:53 compute-0 python3[92856]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   mon dump --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:53 compute-0 podman[92907]: 2025-12-03 21:09:53.912376343 +0000 UTC m=+0.071959571 container create b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf (image=quay.io/ceph/ceph:v20, name=crazy_cartwright, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 03 21:09:53 compute-0 systemd[1]: Started libpod-conmon-b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf.scope.
Dec 03 21:09:53 compute-0 podman[92907]: 2025-12-03 21:09:53.886534325 +0000 UTC m=+0.046117593 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:53 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4590d4f6be376aaa545584357fea44b6465f8802dffd4f7455f1e2d52eb6da3/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c4590d4f6be376aaa545584357fea44b6465f8802dffd4f7455f1e2d52eb6da3/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:54 compute-0 podman[92907]: 2025-12-03 21:09:54.255817873 +0000 UTC m=+0.415401181 container init b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf (image=quay.io/ceph/ceph:v20, name=crazy_cartwright, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:09:54 compute-0 podman[92907]: 2025-12-03 21:09:54.262697493 +0000 UTC m=+0.422280751 container start b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf (image=quay.io/ceph/ceph:v20, name=crazy_cartwright, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True)
Dec 03 21:09:54 compute-0 podman[92907]: 2025-12-03 21:09:54.273645747 +0000 UTC m=+0.433228995 container attach b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf (image=quay.io/ceph/ceph:v20, name=crazy_cartwright, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:09:54 compute-0 ceph-mon[75204]: pgmap v68: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:54 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/910190000' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:09:54 compute-0 podman[92938]: 2025-12-03 21:09:54.395504367 +0000 UTC m=+0.371987423 container create c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_herschel, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:54 compute-0 systemd[1]: Started libpod-conmon-c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573.scope.
Dec 03 21:09:54 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:54 compute-0 podman[92938]: 2025-12-03 21:09:54.453426472 +0000 UTC m=+0.429909578 container init c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_herschel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:09:54 compute-0 podman[92938]: 2025-12-03 21:09:54.46022367 +0000 UTC m=+0.436706726 container start c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_herschel, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:09:54 compute-0 ecstatic_herschel[92974]: 167 167
Dec 03 21:09:54 compute-0 systemd[1]: libpod-c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573.scope: Deactivated successfully.
Dec 03 21:09:54 compute-0 podman[92938]: 2025-12-03 21:09:54.464393945 +0000 UTC m=+0.440877051 container attach c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_herschel, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:54 compute-0 podman[92938]: 2025-12-03 21:09:54.464763063 +0000 UTC m=+0.441246139 container died c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_herschel, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:09:54 compute-0 podman[92938]: 2025-12-03 21:09:54.378049801 +0000 UTC m=+0.354532907 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-66a0c8ec7c0e4f063e57a6f3d5e876729ca264f615fe685276abef8350d47ab2-merged.mount: Deactivated successfully.
Dec 03 21:09:54 compute-0 podman[92938]: 2025-12-03 21:09:54.510475507 +0000 UTC m=+0.486958583 container remove c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_herschel, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:09:54 compute-0 systemd[1]: libpod-conmon-c78068a68f3816f6f13cb086f385da093897ad47ba9f1b0b61d5965fc5d06573.scope: Deactivated successfully.
Dec 03 21:09:54 compute-0 podman[92999]: 2025-12-03 21:09:54.663103436 +0000 UTC m=+0.040579310 container create ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_mahavira, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:09:54 compute-0 systemd[1]: Started libpod-conmon-ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf.scope.
Dec 03 21:09:54 compute-0 podman[92999]: 2025-12-03 21:09:54.644422135 +0000 UTC m=+0.021898049 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 03 21:09:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/978046922' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:09:54 compute-0 crazy_cartwright[92925]: 
Dec 03 21:09:54 compute-0 crazy_cartwright[92925]: {"epoch":1,"fsid":"c21de27e-a7fd-594b-8324-0697ba9aab3a","modified":"2025-12-03T21:07:57.000116Z","created":"2025-12-03T21:07:57.000116Z","min_mon_release":20,"min_mon_release_name":"tentacle","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef","squid","tentacle"],"optional":[]},"mons":[{"rank":0,"name":"compute-0","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.122.100:3300","nonce":0},{"type":"v1","addr":"192.168.122.100:6789","nonce":0}]},"addr":"192.168.122.100:6789/0","public_addr":"192.168.122.100:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
Dec 03 21:09:54 compute-0 crazy_cartwright[92925]: dumped monmap epoch 1
Dec 03 21:09:54 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff72b479d959da9c9768e73b58cb121d8c9d74bf9112d71493968e6fc03d29fa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff72b479d959da9c9768e73b58cb121d8c9d74bf9112d71493968e6fc03d29fa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff72b479d959da9c9768e73b58cb121d8c9d74bf9112d71493968e6fc03d29fa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff72b479d959da9c9768e73b58cb121d8c9d74bf9112d71493968e6fc03d29fa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:54 compute-0 systemd[1]: libpod-b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf.scope: Deactivated successfully.
Dec 03 21:09:54 compute-0 podman[92907]: 2025-12-03 21:09:54.783911315 +0000 UTC m=+0.943494553 container died b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf (image=quay.io/ceph/ceph:v20, name=crazy_cartwright, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:54 compute-0 podman[92999]: 2025-12-03 21:09:54.799381591 +0000 UTC m=+0.176857525 container init ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_mahavira, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:09:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-c4590d4f6be376aaa545584357fea44b6465f8802dffd4f7455f1e2d52eb6da3-merged.mount: Deactivated successfully.
Dec 03 21:09:54 compute-0 podman[92999]: 2025-12-03 21:09:54.809632231 +0000 UTC m=+0.187108095 container start ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_mahavira, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:09:54 compute-0 podman[92999]: 2025-12-03 21:09:54.821315719 +0000 UTC m=+0.198791673 container attach ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_mahavira, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 03 21:09:54 compute-0 podman[92907]: 2025-12-03 21:09:54.82674553 +0000 UTC m=+0.986328748 container remove b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf (image=quay.io/ceph/ceph:v20, name=crazy_cartwright, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:54 compute-0 systemd[1]: libpod-conmon-b7bb3e47733486777c20fd915ee2b6b1ac127a8fe1325ac9726ef8abe89ac3cf.scope: Deactivated successfully.
Dec 03 21:09:54 compute-0 sudo[92854]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v69: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:55 compute-0 sudo[93073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggoltxksdupdgnjnhcxuthqsfvavdoyx ; /usr/bin/python3'
Dec 03 21:09:55 compute-0 sudo[93073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:55 compute-0 python3[93080]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   auth get client.openstack _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:55 compute-0 podman[93114]: 2025-12-03 21:09:55.370729737 +0000 UTC m=+0.022348698 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:55 compute-0 lvm[93139]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:09:55 compute-0 lvm[93139]: VG ceph_vg0 finished
Dec 03 21:09:55 compute-0 lvm[93142]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:09:55 compute-0 lvm[93142]: VG ceph_vg1 finished
Dec 03 21:09:55 compute-0 lvm[93144]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:09:55 compute-0 lvm[93144]: VG ceph_vg2 finished
Dec 03 21:09:55 compute-0 compassionate_mahavira[93015]: {}
Dec 03 21:09:55 compute-0 systemd[1]: libpod-ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf.scope: Deactivated successfully.
Dec 03 21:09:55 compute-0 systemd[1]: libpod-ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf.scope: Consumed 1.373s CPU time.
Dec 03 21:09:55 compute-0 podman[93114]: 2025-12-03 21:09:55.672387882 +0000 UTC m=+0.324006803 container create a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4 (image=quay.io/ceph/ceph:v20, name=compassionate_lewin, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:09:55 compute-0 podman[92999]: 2025-12-03 21:09:55.672945163 +0000 UTC m=+1.050421067 container died ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_mahavira, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True)
Dec 03 21:09:55 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/978046922' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:09:55 compute-0 systemd[1]: Started libpod-conmon-a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4.scope.
Dec 03 21:09:55 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fba48e3eb6a66c8b2feba6ffe873a6c0ae95e449f3bd5bef73fdb57e253d5ad/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fba48e3eb6a66c8b2feba6ffe873a6c0ae95e449f3bd5bef73fdb57e253d5ad/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-ff72b479d959da9c9768e73b58cb121d8c9d74bf9112d71493968e6fc03d29fa-merged.mount: Deactivated successfully.
Dec 03 21:09:55 compute-0 podman[93114]: 2025-12-03 21:09:55.793078959 +0000 UTC m=+0.444697880 container init a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4 (image=quay.io/ceph/ceph:v20, name=compassionate_lewin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:55 compute-0 podman[92999]: 2025-12-03 21:09:55.801833997 +0000 UTC m=+1.179309871 container remove ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_mahavira, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:55 compute-0 podman[93114]: 2025-12-03 21:09:55.802227395 +0000 UTC m=+0.453846316 container start a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4 (image=quay.io/ceph/ceph:v20, name=compassionate_lewin, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:55 compute-0 sudo[92882]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:55 compute-0 podman[93114]: 2025-12-03 21:09:55.974634829 +0000 UTC m=+0.626253790 container attach a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4 (image=quay.io/ceph/ceph:v20, name=compassionate_lewin, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:09:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:55 compute-0 systemd[1]: libpod-conmon-ac98c39d52de68d52f5492bdd2eb38394377e77d81831539ab0f877b896738cf.scope: Deactivated successfully.
Dec 03 21:09:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:56 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 765805c2-a519-46f9-90b4-73dfda2b5520 (Updating mds.cephfs deployment (+1 -> 1))
Dec 03 21:09:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.gzkqle", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 03 21:09:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.gzkqle", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 03 21:09:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.gzkqle", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 03 21:09:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:56 compute-0 ceph-mgr[75500]: [cephadm INFO cephadm.serve] Deploying daemon mds.cephfs.compute-0.gzkqle on compute-0
Dec 03 21:09:56 compute-0 ceph-mgr[75500]: log_channel(cephadm) log [INF] : Deploying daemon mds.cephfs.compute-0.gzkqle on compute-0
Dec 03 21:09:56 compute-0 sudo[93187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:56 compute-0 sudo[93187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:56 compute-0 sudo[93187]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:56 compute-0 sudo[93212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 _orch deploy --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a
Dec 03 21:09:56 compute-0 sudo[93212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.openstack"} v 0)
Dec 03 21:09:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='client.? 192.168.122.100:0/3575152023' entity='client.admin' cmd={"prefix": "auth get", "entity": "client.openstack"} : dispatch
Dec 03 21:09:56 compute-0 compassionate_lewin[93164]: [client.openstack]
Dec 03 21:09:56 compute-0 compassionate_lewin[93164]:         key = AQB5pjBpAAAAABAAKWIHAEu4Fcpg9BW4WoYnAg==
Dec 03 21:09:56 compute-0 compassionate_lewin[93164]:         caps mgr = "allow *"
Dec 03 21:09:56 compute-0 compassionate_lewin[93164]:         caps mon = "profile rbd"
Dec 03 21:09:56 compute-0 compassionate_lewin[93164]:         caps osd = "profile rbd pool=vms, profile rbd pool=volumes, profile rbd pool=backups, profile rbd pool=images, profile rbd pool=cephfs.cephfs.meta, profile rbd pool=cephfs.cephfs.data"
Dec 03 21:09:56 compute-0 systemd[1]: libpod-a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4.scope: Deactivated successfully.
Dec 03 21:09:56 compute-0 conmon[93164]: conmon a8648d6495de6ca6f7ce <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4.scope/container/memory.events
Dec 03 21:09:56 compute-0 podman[93114]: 2025-12-03 21:09:56.475632217 +0000 UTC m=+1.127251138 container died a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4 (image=quay.io/ceph/ceph:v20, name=compassionate_lewin, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-6fba48e3eb6a66c8b2feba6ffe873a6c0ae95e449f3bd5bef73fdb57e253d5ad-merged.mount: Deactivated successfully.
Dec 03 21:09:56 compute-0 podman[93114]: 2025-12-03 21:09:56.570996266 +0000 UTC m=+1.222615237 container remove a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4 (image=quay.io/ceph/ceph:v20, name=compassionate_lewin, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:56 compute-0 systemd[1]: libpod-conmon-a8648d6495de6ca6f7cebd944dc83911eb62d77acf1a84b8297c8a95325c24a4.scope: Deactivated successfully.
Dec 03 21:09:56 compute-0 sudo[93073]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:56 compute-0 ceph-mon[75204]: pgmap v69: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.gzkqle", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 03 21:09:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.gzkqle", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 03 21:09:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:56 compute-0 ceph-mon[75204]: Deploying daemon mds.cephfs.compute-0.gzkqle on compute-0
Dec 03 21:09:56 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3575152023' entity='client.admin' cmd={"prefix": "auth get", "entity": "client.openstack"} : dispatch
Dec 03 21:09:56 compute-0 podman[93289]: 2025-12-03 21:09:56.686384734 +0000 UTC m=+0.047334098 container create ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_cannon, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:56 compute-0 systemd[1]: Started libpod-conmon-ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3.scope.
Dec 03 21:09:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:09:56 compute-0 podman[93289]: 2025-12-03 21:09:56.666307164 +0000 UTC m=+0.027256588 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:56 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:56 compute-0 podman[93289]: 2025-12-03 21:09:56.773533245 +0000 UTC m=+0.134482609 container init ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_cannon, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:56 compute-0 podman[93289]: 2025-12-03 21:09:56.779674601 +0000 UTC m=+0.140623965 container start ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_cannon, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030)
Dec 03 21:09:56 compute-0 nervous_cannon[93306]: 167 167
Dec 03 21:09:56 compute-0 podman[93289]: 2025-12-03 21:09:56.783045919 +0000 UTC m=+0.143995283 container attach ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_cannon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:09:56 compute-0 systemd[1]: libpod-ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3.scope: Deactivated successfully.
Dec 03 21:09:56 compute-0 podman[93289]: 2025-12-03 21:09:56.783975039 +0000 UTC m=+0.144924403 container died ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_cannon, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:09:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-6774216d207ff941eafcda8fa50a5f654e1faf0da3794bf55801f2233ca8a42a-merged.mount: Deactivated successfully.
Dec 03 21:09:56 compute-0 podman[93289]: 2025-12-03 21:09:56.819996214 +0000 UTC m=+0.180945578 container remove ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_cannon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:09:56 compute-0 systemd[1]: libpod-conmon-ed820a545b2b0787f00c74a6f79762264f1bf919c8e31757a9e44c2fc7016dc3.scope: Deactivated successfully.
Dec 03 21:09:56 compute-0 systemd[1]: Reloading.
Dec 03 21:09:56 compute-0 systemd-rc-local-generator[93349]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:56 compute-0 systemd-sysv-generator[93353]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v70: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:57 compute-0 systemd[1]: Reloading.
Dec 03 21:09:57 compute-0 systemd-rc-local-generator[93390]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:09:57 compute-0 systemd-sysv-generator[93396]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:09:57 compute-0 systemd[1]: Starting Ceph mds.cephfs.compute-0.gzkqle for c21de27e-a7fd-594b-8324-0697ba9aab3a...
Dec 03 21:09:57 compute-0 ceph-mon[75204]: pgmap v70: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:57 compute-0 podman[93510]: 2025-12-03 21:09:57.870640875 +0000 UTC m=+0.057503735 container create 696a375e6a5a7e1fe4808c2104075d852001912137351307176607b220314d9c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mds-cephfs-compute-0-gzkqle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:09:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1370250e29db15f8546b911b832718ee75670b93b2501a1ab31cb7db25291d5c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1370250e29db15f8546b911b832718ee75670b93b2501a1ab31cb7db25291d5c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1370250e29db15f8546b911b832718ee75670b93b2501a1ab31cb7db25291d5c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1370250e29db15f8546b911b832718ee75670b93b2501a1ab31cb7db25291d5c/merged/var/lib/ceph/mds/ceph-cephfs.compute-0.gzkqle supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:57 compute-0 podman[93510]: 2025-12-03 21:09:57.846728217 +0000 UTC m=+0.033591127 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:09:57 compute-0 podman[93510]: 2025-12-03 21:09:57.954757054 +0000 UTC m=+0.141619954 container init 696a375e6a5a7e1fe4808c2104075d852001912137351307176607b220314d9c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mds-cephfs-compute-0-gzkqle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:09:57 compute-0 podman[93510]: 2025-12-03 21:09:57.970636639 +0000 UTC m=+0.157499519 container start 696a375e6a5a7e1fe4808c2104075d852001912137351307176607b220314d9c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mds-cephfs-compute-0-gzkqle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:09:57 compute-0 bash[93510]: 696a375e6a5a7e1fe4808c2104075d852001912137351307176607b220314d9c
Dec 03 21:09:57 compute-0 systemd[1]: Started Ceph mds.cephfs.compute-0.gzkqle for c21de27e-a7fd-594b-8324-0697ba9aab3a.
Dec 03 21:09:58 compute-0 ceph-mds[93586]: set uid:gid to 167:167 (ceph:ceph)
Dec 03 21:09:58 compute-0 ceph-mds[93586]: ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo), process ceph-mds, pid 2
Dec 03 21:09:58 compute-0 ceph-mds[93586]: main not setting numa affinity
Dec 03 21:09:58 compute-0 ceph-mds[93586]: pidfile_write: ignore empty --pid-file
Dec 03 21:09:58 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mds-cephfs-compute-0-gzkqle[93561]: starting mds.cephfs.compute-0.gzkqle at 
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle Updating MDS map to version 2 from mon.0
Dec 03 21:09:58 compute-0 sudo[93212]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:58 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 765805c2-a519-46f9-90b4-73dfda2b5520 (Updating mds.cephfs deployment (+1 -> 1))
Dec 03 21:09:58 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 765805c2-a519-46f9-90b4-73dfda2b5520 (Updating mds.cephfs deployment (+1 -> 1)) in 2 seconds
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config set, name=mds_join_fs}] v 0)
Dec 03 21:09:58 compute-0 sudo[93633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwhytzjqegrxquanqerrjhcqlhogelny ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764796197.6702933-36878-224517071162573/async_wrapper.py j892459367166 30 /home/zuul/.ansible/tmp/ansible-tmp-1764796197.6702933-36878-224517071162573/AnsiballZ_command.py _'
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:58 compute-0 sudo[93633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mds.cephfs}] v 0)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:58 compute-0 sudo[93636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:09:58 compute-0 sudo[93636]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:58 compute-0 sudo[93636]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:58 compute-0 sudo[93661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:58 compute-0 sudo[93661]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:58 compute-0 sudo[93661]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:58 compute-0 ansible-async_wrapper.py[93635]: Invoked with j892459367166 30 /home/zuul/.ansible/tmp/ansible-tmp-1764796197.6702933-36878-224517071162573/AnsiballZ_command.py _
Dec 03 21:09:58 compute-0 ansible-async_wrapper.py[93693]: Starting module and watcher
Dec 03 21:09:58 compute-0 ansible-async_wrapper.py[93693]: Start watching 93695 (30)
Dec 03 21:09:58 compute-0 ansible-async_wrapper.py[93695]: Start module (93695)
Dec 03 21:09:58 compute-0 ansible-async_wrapper.py[93635]: Return async_wrapper task started.
Dec 03 21:09:58 compute-0 sudo[93633]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:58 compute-0 sudo[93686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:09:58 compute-0 sudo[93686]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:58 compute-0 python3[93699]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:09:58 compute-0 podman[93716]: 2025-12-03 21:09:58.515058425 +0000 UTC m=+0.079018676 container create dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001 (image=quay.io/ceph/ceph:v20, name=goofy_babbage, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default)
Dec 03 21:09:58 compute-0 systemd[1]: Started libpod-conmon-dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001.scope.
Dec 03 21:09:58 compute-0 podman[93716]: 2025-12-03 21:09:58.478500818 +0000 UTC m=+0.042461129 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:09:58 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:09:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ab1ef4295c2fbcba9f6de81d10d2fb9312b853db1506f75006916f692ba934e/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ab1ef4295c2fbcba9f6de81d10d2fb9312b853db1506f75006916f692ba934e/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:09:58 compute-0 podman[93716]: 2025-12-03 21:09:58.636379465 +0000 UTC m=+0.200339796 container init dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001 (image=quay.io/ceph/ceph:v20, name=goofy_babbage, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True)
Dec 03 21:09:58 compute-0 podman[93716]: 2025-12-03 21:09:58.649514242 +0000 UTC m=+0.213474493 container start dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001 (image=quay.io/ceph/ceph:v20, name=goofy_babbage, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:58 compute-0 podman[93716]: 2025-12-03 21:09:58.653876432 +0000 UTC m=+0.217836673 container attach dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001 (image=quay.io/ceph/ceph:v20, name=goofy_babbage, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e3 new map
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e3 print_map
                                           e3
                                           btime 2025-12-03T21:09:58:698664+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-03T21:09:47.860950+0000
                                           modified        2025-12-03T21:09:47.860950+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.gzkqle{-1:14242} state up:standby seq 1 addr [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] compat {c=[1],r=[1],i=[1fff]}]
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle Updating MDS map to version 3 from mon.0
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle Monitors have assigned me to become a standby
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mds.? [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] up:boot
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e3 assigned standby [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] as mds.0
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : daemon mds.cephfs.compute-0.gzkqle assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : Cluster is now healthy
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : fsmap cephfs:0 1 up:standby
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds metadata", "who": "cephfs.compute-0.gzkqle"} v 0)
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mds metadata", "who": "cephfs.compute-0.gzkqle"} : dispatch
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e3 all = 0
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e4 new map
Dec 03 21:09:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e4 print_map
                                           e4
                                           btime 2025-12-03T21:09:58:707549+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-03T21:09:47.860950+0000
                                           modified        2025-12-03T21:09:58.707539+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14242}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-0.gzkqle{0:14242} state up:creating seq 1 addr [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle Updating MDS map to version 4 from mon.0
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=cephfs.compute-0.gzkqle=up:creating}
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.4 handle_mds_map I am now mds.0.4
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x1
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x100
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x600
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x601
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x602
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x603
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x604
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x605
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x606
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x607
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x608
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.cache creating system inode with ino:0x609
Dec 03 21:09:58 compute-0 ceph-mds[93586]: mds.0.4 creating_done
Dec 03 21:09:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [INF] : daemon mds.cephfs.compute-0.gzkqle is now active in filesystem cephfs as rank 0
Dec 03 21:09:58 compute-0 podman[93784]: 2025-12-03 21:09:58.844910516 +0000 UTC m=+0.073697307 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:09:58 compute-0 podman[93784]: 2025-12-03 21:09:58.99631774 +0000 UTC m=+0.225104501 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mds.? [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] up:boot
Dec 03 21:09:59 compute-0 ceph-mon[75204]: daemon mds.cephfs.compute-0.gzkqle assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: Cluster is now healthy
Dec 03 21:09:59 compute-0 ceph-mon[75204]: fsmap cephfs:0 1 up:standby
Dec 03 21:09:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "mds metadata", "who": "cephfs.compute-0.gzkqle"} : dispatch
Dec 03 21:09:59 compute-0 ceph-mon[75204]: fsmap cephfs:1 {0=cephfs.compute-0.gzkqle=up:creating}
Dec 03 21:09:59 compute-0 ceph-mon[75204]: daemon mds.cephfs.compute-0.gzkqle is now active in filesystem cephfs as rank 0
Dec 03 21:09:59 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14244 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:09:59 compute-0 goofy_babbage[93744]: 
Dec 03 21:09:59 compute-0 goofy_babbage[93744]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 03 21:09:59 compute-0 systemd[1]: libpod-dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001.scope: Deactivated successfully.
Dec 03 21:09:59 compute-0 podman[93716]: 2025-12-03 21:09:59.110465943 +0000 UTC m=+0.674426174 container died dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001 (image=quay.io/ceph/ceph:v20, name=goofy_babbage, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:09:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-7ab1ef4295c2fbcba9f6de81d10d2fb9312b853db1506f75006916f692ba934e-merged.mount: Deactivated successfully.
Dec 03 21:09:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v71: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:09:59 compute-0 podman[93716]: 2025-12-03 21:09:59.153162195 +0000 UTC m=+0.717122426 container remove dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001 (image=quay.io/ceph/ceph:v20, name=goofy_babbage, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle)
Dec 03 21:09:59 compute-0 systemd[1]: libpod-conmon-dcfe9b2c5f286e305b4cdc3fa215ad4f3372a547a227fcf155804e9034d6e001.scope: Deactivated successfully.
Dec 03 21:09:59 compute-0 ansible-async_wrapper.py[93695]: Module complete (93695)
Dec 03 21:09:59 compute-0 sudo[93988]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciwozprpuxlnprkypardmhrayecopvij ; /usr/bin/python3'
Dec 03 21:09:59 compute-0 sudo[93988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:59 compute-0 python3[93997]: ansible-ansible.legacy.async_status Invoked with jid=j892459367166.93635 mode=status _async_dir=/root/.ansible_async
Dec 03 21:09:59 compute-0 sudo[93988]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:59 compute-0 sudo[93686]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:09:59 compute-0 sudo[94080]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyespkwxkddumgviwtkaqygjvfjpvtpq ; /usr/bin/python3'
Dec 03 21:09:59 compute-0 sudo[94080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e5 new map
Dec 03 21:09:59 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle Updating MDS map to version 5 from mon.0
Dec 03 21:09:59 compute-0 ceph-mds[93586]: mds.0.4 handle_mds_map I am now mds.0.4
Dec 03 21:09:59 compute-0 ceph-mds[93586]: mds.0.4 handle_mds_map state change up:creating --> up:active
Dec 03 21:09:59 compute-0 ceph-mds[93586]: mds.0.4 recovery_done -- successful recovery!
Dec 03 21:09:59 compute-0 ceph-mds[93586]: mds.0.4 active_start
Dec 03 21:09:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e5 print_map
                                           e5
                                           btime 2025-12-03T21:09:59:713283+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-03T21:09:47.860950+0000
                                           modified        2025-12-03T21:09:59.713281+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=14242}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 14242 members: 14242
                                           [mds.cephfs.compute-0.gzkqle{0:14242} state up:active seq 2 join_fscid=1 addr [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : mds.? [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] up:active
Dec 03 21:09:59 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=cephfs.compute-0.gzkqle=up:active}
Dec 03 21:09:59 compute-0 sudo[94081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:09:59 compute-0 sudo[94081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:59 compute-0 sudo[94081]: pam_unix(sudo:session): session closed for user root
Dec 03 21:09:59 compute-0 sudo[94112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:09:59 compute-0 sudo[94112]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:09:59 compute-0 python3[94090]: ansible-ansible.legacy.async_status Invoked with jid=j892459367166.93635 mode=cleanup _async_dir=/root/.ansible_async
Dec 03 21:09:59 compute-0 sudo[94080]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='client.14244 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:10:00 compute-0 ceph-mon[75204]: pgmap v71: 7 pgs: 7 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:10:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:10:00 compute-0 ceph-mon[75204]: mds.? [v2:192.168.122.100:6814/3914722781,v1:192.168.122.100:6815/3914722781] up:active
Dec 03 21:10:00 compute-0 ceph-mon[75204]: fsmap cephfs:1 {0=cephfs.compute-0.gzkqle=up:active}
Dec 03 21:10:00 compute-0 podman[94149]: 2025-12-03 21:10:00.113012201 +0000 UTC m=+0.061831364 container create f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_moore, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:10:00 compute-0 systemd[1]: Started libpod-conmon-f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51.scope.
Dec 03 21:10:00 compute-0 podman[94149]: 2025-12-03 21:10:00.08848124 +0000 UTC m=+0.037300393 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:00 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:00 compute-0 podman[94149]: 2025-12-03 21:10:00.214496215 +0000 UTC m=+0.163315428 container init f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_moore, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 03 21:10:00 compute-0 podman[94149]: 2025-12-03 21:10:00.22794072 +0000 UTC m=+0.176759883 container start f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_moore, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:10:00 compute-0 podman[94149]: 2025-12-03 21:10:00.233464863 +0000 UTC m=+0.182299676 container attach f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_moore, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:00 compute-0 serene_moore[94166]: 167 167
Dec 03 21:10:00 compute-0 systemd[1]: libpod-f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51.scope: Deactivated successfully.
Dec 03 21:10:00 compute-0 podman[94149]: 2025-12-03 21:10:00.235832642 +0000 UTC m=+0.184651825 container died f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_moore, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:10:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-22de79ed5dffd72224bca820dce6861ff6dc566b38cbce86276cbe23fe023983-merged.mount: Deactivated successfully.
Dec 03 21:10:00 compute-0 podman[94149]: 2025-12-03 21:10:00.280901553 +0000 UTC m=+0.229720686 container remove f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_moore, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:10:00 compute-0 sudo[94203]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtlpmgcdiwtqaownkinyfthyicpktbke ; /usr/bin/python3'
Dec 03 21:10:00 compute-0 sudo[94203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:00 compute-0 systemd[1]: libpod-conmon-f122a5cc9b1a0eba0e2a8cdc3af9db8e49087dc154ca01d289c1baadcc596a51.scope: Deactivated successfully.
Dec 03 21:10:00 compute-0 python3[94207]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch status --format json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:00 compute-0 podman[94214]: 2025-12-03 21:10:00.44598991 +0000 UTC m=+0.050962863 container create 10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_ellis, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:10:00 compute-0 podman[94227]: 2025-12-03 21:10:00.484845659 +0000 UTC m=+0.056081080 container create aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d (image=quay.io/ceph/ceph:v20, name=elegant_satoshi, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:10:00 compute-0 systemd[1]: Started libpod-conmon-10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045.scope.
Dec 03 21:10:00 compute-0 podman[94214]: 2025-12-03 21:10:00.425464931 +0000 UTC m=+0.030437904 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:00 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:00 compute-0 systemd[1]: Started libpod-conmon-aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d.scope.
Dec 03 21:10:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bebfa8646a815e8c7fe4004da4bd61e2904a81607cda449252b516ff7a88ad4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bebfa8646a815e8c7fe4004da4bd61e2904a81607cda449252b516ff7a88ad4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bebfa8646a815e8c7fe4004da4bd61e2904a81607cda449252b516ff7a88ad4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bebfa8646a815e8c7fe4004da4bd61e2904a81607cda449252b516ff7a88ad4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bebfa8646a815e8c7fe4004da4bd61e2904a81607cda449252b516ff7a88ad4/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:00 compute-0 podman[94214]: 2025-12-03 21:10:00.551327074 +0000 UTC m=+0.156300047 container init 10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_ellis, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:10:00 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc23dea4cd64f6a6781804282442521867f65e4c194a91a6a7f2d1ee152cc6a4/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc23dea4cd64f6a6781804282442521867f65e4c194a91a6a7f2d1ee152cc6a4/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:00 compute-0 podman[94214]: 2025-12-03 21:10:00.559037821 +0000 UTC m=+0.164010774 container start 10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_ellis, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:10:00 compute-0 podman[94227]: 2025-12-03 21:10:00.466669692 +0000 UTC m=+0.037905123 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:10:00 compute-0 podman[94214]: 2025-12-03 21:10:00.563005547 +0000 UTC m=+0.167978510 container attach 10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_ellis, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:00 compute-0 podman[94227]: 2025-12-03 21:10:00.57809273 +0000 UTC m=+0.149328191 container init aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d (image=quay.io/ceph/ceph:v20, name=elegant_satoshi, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:10:00 compute-0 podman[94227]: 2025-12-03 21:10:00.593038759 +0000 UTC m=+0.164274180 container start aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d (image=quay.io/ceph/ceph:v20, name=elegant_satoshi, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:10:00 compute-0 podman[94227]: 2025-12-03 21:10:00.596151022 +0000 UTC m=+0.167386443 container attach aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d (image=quay.io/ceph/ceph:v20, name=elegant_satoshi, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:10:01 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14246 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:10:01 compute-0 elegant_satoshi[94249]: 
Dec 03 21:10:01 compute-0 elegant_satoshi[94249]: {"available": true, "backend": "cephadm", "paused": false, "workers": 10}
Dec 03 21:10:01 compute-0 systemd[1]: libpod-aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d.scope: Deactivated successfully.
Dec 03 21:10:01 compute-0 podman[94227]: 2025-12-03 21:10:01.108819893 +0000 UTC m=+0.680055304 container died aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d (image=quay.io/ceph/ceph:v20, name=elegant_satoshi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-fc23dea4cd64f6a6781804282442521867f65e4c194a91a6a7f2d1ee152cc6a4-merged.mount: Deactivated successfully.
Dec 03 21:10:01 compute-0 sad_ellis[94241]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:10:01 compute-0 sad_ellis[94241]: --> All data devices are unavailable
Dec 03 21:10:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v72: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:01 compute-0 podman[94227]: 2025-12-03 21:10:01.152140231 +0000 UTC m=+0.723375652 container remove aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d (image=quay.io/ceph/ceph:v20, name=elegant_satoshi, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:01 compute-0 systemd[1]: libpod-conmon-aedd04ec8cb1c1c1f47543c232853453dbb6db4e60899a631c10a7331376412d.scope: Deactivated successfully.
Dec 03 21:10:01 compute-0 sudo[94203]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:01 compute-0 systemd[1]: libpod-10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045.scope: Deactivated successfully.
Dec 03 21:10:01 compute-0 podman[94214]: 2025-12-03 21:10:01.180981791 +0000 UTC m=+0.785954764 container died 10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_ellis, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 03 21:10:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-4bebfa8646a815e8c7fe4004da4bd61e2904a81607cda449252b516ff7a88ad4-merged.mount: Deactivated successfully.
Dec 03 21:10:01 compute-0 podman[94214]: 2025-12-03 21:10:01.23030035 +0000 UTC m=+0.835273303 container remove 10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_ellis, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:10:01 compute-0 systemd[1]: libpod-conmon-10a6be72d412f3b30050c4f0912ed4847ce376d6bce7f349e14a59fb67da0045.scope: Deactivated successfully.
Dec 03 21:10:01 compute-0 sudo[94112]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:01 compute-0 sudo[94314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:10:01 compute-0 sudo[94314]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:01 compute-0 sudo[94314]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:01 compute-0 sudo[94339]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:10:01 compute-0 sudo[94339]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:01 compute-0 podman[94377]: 2025-12-03 21:10:01.680236113 +0000 UTC m=+0.041580262 container create 4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_mclean, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030)
Dec 03 21:10:01 compute-0 systemd[1]: Started libpod-conmon-4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b.scope.
Dec 03 21:10:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:01 compute-0 podman[94377]: 2025-12-03 21:10:01.661945545 +0000 UTC m=+0.023289714 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:01 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:01 compute-0 podman[94377]: 2025-12-03 21:10:01.777125573 +0000 UTC m=+0.138469772 container init 4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_mclean, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:01 compute-0 ceph-mgr[75500]: [progress INFO root] Writing back 4 completed events
Dec 03 21:10:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 03 21:10:01 compute-0 podman[94377]: 2025-12-03 21:10:01.789064582 +0000 UTC m=+0.150408731 container start 4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_mclean, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:01 compute-0 podman[94377]: 2025-12-03 21:10:01.792810542 +0000 UTC m=+0.154154781 container attach 4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_mclean, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:10:01 compute-0 eager_mclean[94393]: 167 167
Dec 03 21:10:01 compute-0 systemd[1]: libpod-4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b.scope: Deactivated successfully.
Dec 03 21:10:01 compute-0 podman[94377]: 2025-12-03 21:10:01.794672431 +0000 UTC m=+0.156016640 container died 4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_mclean, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:10:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-f406ae5b18f923d2bf6149a15c19ec26d219b5e6b06468e3d1e569fb66376a2b-merged.mount: Deactivated successfully.
Dec 03 21:10:01 compute-0 podman[94377]: 2025-12-03 21:10:01.855758174 +0000 UTC m=+0.217102333 container remove 4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_mclean, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:01 compute-0 systemd[1]: libpod-conmon-4405fba53c416947f1ad61c93aa74182911dda92bdc69977072ceae788127d5b.scope: Deactivated successfully.
Dec 03 21:10:01 compute-0 sudo[94437]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttkxucpfjwmrrzkhqqaunlnkknvkwufy ; /usr/bin/python3'
Dec 03 21:10:01 compute-0 sudo[94437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:02 compute-0 podman[94445]: 2025-12-03 21:10:02.022896541 +0000 UTC m=+0.049853454 container create 82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_volhard, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:02 compute-0 python3[94439]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch ls --export -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:02 compute-0 systemd[1]: Started libpod-conmon-82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d.scope.
Dec 03 21:10:02 compute-0 podman[94445]: 2025-12-03 21:10:02.001253002 +0000 UTC m=+0.028209915 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:02 compute-0 podman[94459]: 2025-12-03 21:10:02.098189893 +0000 UTC m=+0.047212693 container create c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db (image=quay.io/ceph/ceph:v20, name=stupefied_poincare, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:02 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7b01352322ed5d62984b3c896d782745067f29827a591abbf145e30e9d4dca3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7b01352322ed5d62984b3c896d782745067f29827a591abbf145e30e9d4dca3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7b01352322ed5d62984b3c896d782745067f29827a591abbf145e30e9d4dca3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7b01352322ed5d62984b3c896d782745067f29827a591abbf145e30e9d4dca3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:02 compute-0 podman[94445]: 2025-12-03 21:10:02.11940689 +0000 UTC m=+0.146363833 container init 82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_volhard, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:10:02 compute-0 systemd[1]: Started libpod-conmon-c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db.scope.
Dec 03 21:10:02 compute-0 podman[94445]: 2025-12-03 21:10:02.127203158 +0000 UTC m=+0.154160051 container start 82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_volhard, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:10:02 compute-0 podman[94445]: 2025-12-03 21:10:02.131955255 +0000 UTC m=+0.158912168 container attach 82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_volhard, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:10:02 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50702146cb93b9c0787f85c5ec1692ca8685497eba0979dff255f833729f92c/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a50702146cb93b9c0787f85c5ec1692ca8685497eba0979dff255f833729f92c/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:02 compute-0 podman[94459]: 2025-12-03 21:10:02.168390969 +0000 UTC m=+0.117413859 container init c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db (image=quay.io/ceph/ceph:v20, name=stupefied_poincare, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:10:02 compute-0 podman[94459]: 2025-12-03 21:10:02.07375961 +0000 UTC m=+0.022782440 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:10:02 compute-0 podman[94459]: 2025-12-03 21:10:02.177607105 +0000 UTC m=+0.126629945 container start c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db (image=quay.io/ceph/ceph:v20, name=stupefied_poincare, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:02 compute-0 podman[94459]: 2025-12-03 21:10:02.182095084 +0000 UTC m=+0.131117974 container attach c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db (image=quay.io/ceph/ceph:v20, name=stupefied_poincare, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:10:02 compute-0 ceph-mon[75204]: from='client.14246 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:10:02 compute-0 ceph-mon[75204]: pgmap v72: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:02 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]: {
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:     "0": [
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:         {
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "devices": [
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "/dev/loop3"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             ],
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_name": "ceph_lv0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_size": "21470642176",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "name": "ceph_lv0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "tags": {
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cluster_name": "ceph",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.crush_device_class": "",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.encrypted": "0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.objectstore": "bluestore",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osd_id": "0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.type": "block",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.vdo": "0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.with_tpm": "0"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             },
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "type": "block",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "vg_name": "ceph_vg0"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:         }
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:     ],
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:     "1": [
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:         {
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "devices": [
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "/dev/loop4"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             ],
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_name": "ceph_lv1",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_size": "21470642176",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "name": "ceph_lv1",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "tags": {
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cluster_name": "ceph",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.crush_device_class": "",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.encrypted": "0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.objectstore": "bluestore",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osd_id": "1",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.type": "block",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.vdo": "0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.with_tpm": "0"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             },
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "type": "block",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "vg_name": "ceph_vg1"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:         }
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:     ],
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:     "2": [
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:         {
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "devices": [
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "/dev/loop5"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             ],
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_name": "ceph_lv2",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_size": "21470642176",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "name": "ceph_lv2",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "tags": {
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.cluster_name": "ceph",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.crush_device_class": "",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.encrypted": "0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.objectstore": "bluestore",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osd_id": "2",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.type": "block",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.vdo": "0",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:                 "ceph.with_tpm": "0"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             },
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "type": "block",
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:             "vg_name": "ceph_vg2"
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:         }
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]:     ]
Dec 03 21:10:02 compute-0 relaxed_volhard[94475]: }
Dec 03 21:10:02 compute-0 systemd[1]: libpod-82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d.scope: Deactivated successfully.
Dec 03 21:10:02 compute-0 podman[94509]: 2025-12-03 21:10:02.494494613 +0000 UTC m=+0.026438897 container died 82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_volhard, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:10:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-d7b01352322ed5d62984b3c896d782745067f29827a591abbf145e30e9d4dca3-merged.mount: Deactivated successfully.
Dec 03 21:10:02 compute-0 podman[94509]: 2025-12-03 21:10:02.592981425 +0000 UTC m=+0.124925739 container remove 82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_volhard, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:10:02 compute-0 systemd[1]: libpod-conmon-82ec1722127ae299a52d45151adf2a5dc205db6c28e92bbee53decb4cb67ff3d.scope: Deactivated successfully.
Dec 03 21:10:02 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14248 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:10:02 compute-0 sudo[94339]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:02 compute-0 stupefied_poincare[94481]: 
Dec 03 21:10:02 compute-0 stupefied_poincare[94481]: [{"placement": {"host_pattern": "*"}, "service_name": "crash", "service_type": "crash"}, {"placement": {"hosts": ["compute-0"]}, "service_id": "cephfs", "service_name": "mds.cephfs", "service_type": "mds"}, {"placement": {"hosts": ["compute-0"]}, "service_name": "mgr", "service_type": "mgr"}, {"placement": {"hosts": ["compute-0"]}, "service_name": "mon", "service_type": "mon"}, {"placement": {"hosts": ["compute-0"]}, "service_id": "default_drive_group", "service_name": "osd.default_drive_group", "service_type": "osd", "spec": {"data_devices": {"paths": ["/dev/ceph_vg0/ceph_lv0", "/dev/ceph_vg1/ceph_lv1", "/dev/ceph_vg2/ceph_lv2"]}, "filter_logic": "AND", "objectstore": "bluestore"}}]
Dec 03 21:10:02 compute-0 systemd[1]: libpod-c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db.scope: Deactivated successfully.
Dec 03 21:10:02 compute-0 podman[94459]: 2025-12-03 21:10:02.68634768 +0000 UTC m=+0.635370480 container died c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db (image=quay.io/ceph/ceph:v20, name=stupefied_poincare, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:10:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-a50702146cb93b9c0787f85c5ec1692ca8685497eba0979dff255f833729f92c-merged.mount: Deactivated successfully.
Dec 03 21:10:02 compute-0 sudo[94526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:10:02 compute-0 sudo[94526]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:02 compute-0 podman[94459]: 2025-12-03 21:10:02.73271015 +0000 UTC m=+0.681732950 container remove c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db (image=quay.io/ceph/ceph:v20, name=stupefied_poincare, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:02 compute-0 sudo[94526]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:02 compute-0 systemd[1]: libpod-conmon-c3e064ab20c10af36be7c7d6cbdf22a6e7a22913a43e641d957ad1f2565e51db.scope: Deactivated successfully.
Dec 03 21:10:02 compute-0 sudo[94437]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:02 compute-0 sudo[94562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:10:02 compute-0 sudo[94562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:03 compute-0 podman[94597]: 2025-12-03 21:10:03.090671056 +0000 UTC m=+0.035067349 container create 84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_sanderson, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:03 compute-0 systemd[1]: Started libpod-conmon-84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf.scope.
Dec 03 21:10:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v73: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:03 compute-0 podman[94597]: 2025-12-03 21:10:03.074927125 +0000 UTC m=+0.019323438 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:03 compute-0 podman[94597]: 2025-12-03 21:10:03.176818528 +0000 UTC m=+0.121214831 container init 84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_sanderson, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 03 21:10:03 compute-0 podman[94597]: 2025-12-03 21:10:03.187166384 +0000 UTC m=+0.131562707 container start 84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_sanderson, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:03 compute-0 podman[94597]: 2025-12-03 21:10:03.191596203 +0000 UTC m=+0.135992506 container attach 84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_sanderson, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:10:03 compute-0 nervous_sanderson[94613]: 167 167
Dec 03 21:10:03 compute-0 systemd[1]: libpod-84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf.scope: Deactivated successfully.
Dec 03 21:10:03 compute-0 podman[94597]: 2025-12-03 21:10:03.194698046 +0000 UTC m=+0.139094419 container died 84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:10:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-9aa8d0ee6222cb136c17db159405ea05988525ffd2677588ca996d4a48347ad3-merged.mount: Deactivated successfully.
Dec 03 21:10:03 compute-0 podman[94597]: 2025-12-03 21:10:03.241082855 +0000 UTC m=+0.185479188 container remove 84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_sanderson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:03 compute-0 ansible-async_wrapper.py[93693]: Done in kid B.
Dec 03 21:10:03 compute-0 systemd[1]: libpod-conmon-84596d629f7f0172fc3fc006a5c80428cb75ac5729e0f94e705b3fa292da28bf.scope: Deactivated successfully.
Dec 03 21:10:03 compute-0 podman[94637]: 2025-12-03 21:10:03.496378888 +0000 UTC m=+0.068723858 container create dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=blissful_brattain, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:10:03 compute-0 systemd[1]: Started libpod-conmon-dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb.scope.
Dec 03 21:10:03 compute-0 podman[94637]: 2025-12-03 21:10:03.467181327 +0000 UTC m=+0.039526387 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:03 compute-0 sudo[94679]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmsipevzvubydbcwgyoyhmhbczgvxdgm ; /usr/bin/python3'
Dec 03 21:10:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c8a86f9c9b1ea49d4ffcc6a7a2b5409aae37adc3bdd5cbbfc22897c8d3e070e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:03 compute-0 sudo[94679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c8a86f9c9b1ea49d4ffcc6a7a2b5409aae37adc3bdd5cbbfc22897c8d3e070e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c8a86f9c9b1ea49d4ffcc6a7a2b5409aae37adc3bdd5cbbfc22897c8d3e070e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c8a86f9c9b1ea49d4ffcc6a7a2b5409aae37adc3bdd5cbbfc22897c8d3e070e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:03 compute-0 podman[94637]: 2025-12-03 21:10:03.60609526 +0000 UTC m=+0.178440310 container init dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=blissful_brattain, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:10:03 compute-0 podman[94637]: 2025-12-03 21:10:03.618367587 +0000 UTC m=+0.190712597 container start dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=blissful_brattain, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:10:03 compute-0 podman[94637]: 2025-12-03 21:10:03.623663569 +0000 UTC m=+0.196008549 container attach dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=blissful_brattain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:10:03 compute-0 ceph-mds[93586]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec 03 21:10:03 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mds-cephfs-compute-0-gzkqle[93561]: 2025-12-03T21:10:03.729+0000 7f5195f57640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Dec 03 21:10:03 compute-0 python3[94682]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   orch ps -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:03 compute-0 podman[94685]: 2025-12-03 21:10:03.865560433 +0000 UTC m=+0.078017135 container create 7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6 (image=quay.io/ceph/ceph:v20, name=hopeful_shirley, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:10:03 compute-0 systemd[1]: Started libpod-conmon-7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6.scope.
Dec 03 21:10:03 compute-0 podman[94685]: 2025-12-03 21:10:03.829834569 +0000 UTC m=+0.042291361 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:10:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/208048ec0fab4deb0ae96b8055b23eeccd37d9f0a2cd07f1177ddd01f5650586/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/208048ec0fab4deb0ae96b8055b23eeccd37d9f0a2cd07f1177ddd01f5650586/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:03 compute-0 podman[94685]: 2025-12-03 21:10:03.955288531 +0000 UTC m=+0.167745253 container init 7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6 (image=quay.io/ceph/ceph:v20, name=hopeful_shirley, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:03 compute-0 podman[94685]: 2025-12-03 21:10:03.960553252 +0000 UTC m=+0.173009954 container start 7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6 (image=quay.io/ceph/ceph:v20, name=hopeful_shirley, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:10:03 compute-0 podman[94685]: 2025-12-03 21:10:03.964297322 +0000 UTC m=+0.176754074 container attach 7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6 (image=quay.io/ceph/ceph:v20, name=hopeful_shirley, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:10:04 compute-0 ceph-mon[75204]: from='client.14248 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:10:04 compute-0 ceph-mon[75204]: pgmap v73: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:04 compute-0 lvm[94794]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:10:04 compute-0 lvm[94797]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:10:04 compute-0 lvm[94797]: VG ceph_vg1 finished
Dec 03 21:10:04 compute-0 lvm[94794]: VG ceph_vg0 finished
Dec 03 21:10:04 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14250 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:10:04 compute-0 hopeful_shirley[94710]: 
Dec 03 21:10:04 compute-0 hopeful_shirley[94710]: [{"container_id": "4b1e1515111c", "container_image_digests": ["quay.io/ceph/ceph@sha256:4c65c801a8e5e5704934118b2c723e7233f2b5de8552bfc8f129dabe1fced0b1", "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86"], "container_image_id": "524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3", "container_image_name": "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86", "cpu_percentage": "0.25%", "created": "2025-12-03T21:08:44.736340Z", "daemon_id": "compute-0", "daemon_name": "crash.compute-0", "daemon_type": "crash", "events": ["2025-12-03T21:08:44.799038Z daemon:crash.compute-0 [INFO] \"Deployed crash.compute-0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-03T21:09:59.658310Z", "memory_usage": 7803502, "pending_daemon_config": false, "ports": [], "service_name": "crash", "started": "2025-12-03T21:08:44.620473Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@crash.compute-0", "version": "20.2.0"}, {"container_id": "696a375e6a5a", "container_image_digests": ["quay.io/ceph/ceph@sha256:4c65c801a8e5e5704934118b2c723e7233f2b5de8552bfc8f129dabe1fced0b1", "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86"], "container_image_id": "524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3", "container_image_name": "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86", "cpu_percentage": "9.87%", "created": "2025-12-03T21:09:57.986899Z", "daemon_id": "cephfs.compute-0.gzkqle", "daemon_name": "mds.cephfs.compute-0.gzkqle", "daemon_type": "mds", "events": ["2025-12-03T21:09:58.076640Z daemon:mds.cephfs.compute-0.gzkqle [INFO] \"Deployed mds.cephfs.compute-0.gzkqle on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": 
"2025-12-03T21:09:59.658609Z", "memory_usage": 17846763, "pending_daemon_config": false, "ports": [], "service_name": "mds.cephfs", "started": "2025-12-03T21:09:57.852688Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mds.cephfs.compute-0.gzkqle", "version": "20.2.0"}, {"container_id": "3ad5fa1a42ad", "container_image_digests": ["quay.io/ceph/ceph@sha256:4c65c801a8e5e5704934118b2c723e7233f2b5de8552bfc8f129dabe1fced0b1", "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86"], "container_image_id": "524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3", "container_image_name": "quay.io/ceph/ceph:v20", "cpu_percentage": "17.74%", "created": "2025-12-03T21:08:03.769341Z", "daemon_id": "compute-0.jxauqt", "daemon_name": "mgr.compute-0.jxauqt", "daemon_type": "mgr", "events": ["2025-12-03T21:08:49.588947Z daemon:mgr.compute-0.jxauqt [INFO] \"Reconfigured mgr.compute-0.jxauqt on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-03T21:09:59.658242Z", "memory_usage": 545574092, "pending_daemon_config": false, "ports": [9283, 8765], "service_name": "mgr", "started": "2025-12-03T21:08:03.636215Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mgr.compute-0.jxauqt", "version": "20.2.0"}, {"container_id": "5be1cf87f445", "container_image_digests": ["quay.io/ceph/ceph@sha256:4c65c801a8e5e5704934118b2c723e7233f2b5de8552bfc8f129dabe1fced0b1", "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86"], "container_image_id": "524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3", "container_image_name": "quay.io/ceph/ceph:v20", "cpu_percentage": "2.81%", "created": "2025-12-03T21:07:59.265229Z", "daemon_id": "compute-0", "daemon_name": "mon.compute-0", "daemon_type": "mon", "events": ["2025-12-03T21:08:48.987227Z daemon:mon.compute-0 [INFO] 
\"Reconfigured mon.compute-0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-03T21:09:59.658153Z", "memory_request": 2147483648, "memory_usage": 38503710, "pending_daemon_config": false, "ports": [], "service_name": "mon", "started": "2025-12-03T21:08:01.518643Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@mon.compute-0", "version": "20.2.0"}, {"container_id": "fbaf3a19f164", "container_image_digests": ["quay.io/ceph/ceph@sha256:4c65c801a8e5e5704934118b2c723e7233f2b5de8552bfc8f129dabe1fced0b1", "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86"], "container_image_id": "524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3", "container_image_name": "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86", "cpu_percentage": "1.99%", "created": "2025-12-03T21:09:09.359923Z", "daemon_id": "0", "daemon_name": "osd.0", "daemon_type": "osd", "events": ["2025-12-03T21:09:09.428911Z daemon:osd.0 [INFO] \"Deployed osd.0 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-03T21:09:59.658380Z", "memory_request": 4294967296, "memory_usage": 60345548, "pending_daemon_config": false, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-03T21:09:09.272542Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@osd.0", "version": "20.2.0"}, {"container_id": "947e483d8391", "container_image_digests": ["quay.io/ceph/ceph@sha256:4c65c801a8e5e5704934118b2c723e7233f2b5de8552bfc8f129dabe1fced0b1", "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86"], "container_image_id": "524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3", "container_image_name": "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86", 
"cpu_percentage": "2.10%", "created": "2025-12-03T21:09:14.077017Z", "daemon_id": "1", "daemon_name": "osd.1", "daemon_type": "osd", "events": ["2025-12-03T21:09:14.157357Z daemon:osd.1 [INFO] \"Deployed osd.1 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-03T21:09:59.658448Z", "memory_request": 4294967296, "memory_usage": 58395197, "pending_daemon_config": false, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-03T21:09:13.776336Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@osd.1", "version": "20.2.0"}, {"container_id": "f54ced40cf6e", "container_image_digests": ["quay.io/ceph/ceph@sha256:4c65c801a8e5e5704934118b2c723e7233f2b5de8552bfc8f129dabe1fced0b1", "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86"], "container_image_id": "524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3", "container_image_name": "quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86", "cpu_percentage": "2.16%", "created": "2025-12-03T21:09:18.678110Z", "daemon_id": "2", "daemon_name": "osd.2", "daemon_type": "osd", "events": ["2025-12-03T21:09:18.819614Z daemon:osd.2 [INFO] \"Deployed osd.2 on host 'compute-0'\""], "hostname": "compute-0", "is_active": false, "last_refresh": "2025-12-03T21:09:59.658518Z", "memory_request": 4294967296, "memory_usage": 56623104, "pending_daemon_config": false, "ports": [], "service_name": "osd.default_drive_group", "started": "2025-12-03T21:09:18.498669Z", "status": 1, "status_desc": "running", "systemd_unit": "ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a@osd.2", "version": "20.2.0"}]
Dec 03 21:10:04 compute-0 lvm[94800]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:10:04 compute-0 lvm[94800]: VG ceph_vg2 finished
Dec 03 21:10:04 compute-0 systemd[1]: libpod-7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6.scope: Deactivated successfully.
Dec 03 21:10:04 compute-0 podman[94685]: 2025-12-03 21:10:04.381867691 +0000 UTC m=+0.594324393 container died 7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6 (image=quay.io/ceph/ceph:v20, name=hopeful_shirley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:04 compute-0 lvm[94803]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:10:04 compute-0 lvm[94803]: VG ceph_vg0 finished
Dec 03 21:10:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-208048ec0fab4deb0ae96b8055b23eeccd37d9f0a2cd07f1177ddd01f5650586-merged.mount: Deactivated successfully.
Dec 03 21:10:04 compute-0 podman[94685]: 2025-12-03 21:10:04.423811282 +0000 UTC m=+0.636267984 container remove 7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6 (image=quay.io/ceph/ceph:v20, name=hopeful_shirley, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3)
Dec 03 21:10:04 compute-0 systemd[1]: libpod-conmon-7d526ed195324b4734745766ba30f4301c23debeb295d9e949190f49085887e6.scope: Deactivated successfully.
Dec 03 21:10:04 compute-0 sudo[94679]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:04 compute-0 blissful_brattain[94677]: {}
Dec 03 21:10:04 compute-0 systemd[1]: libpod-dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb.scope: Deactivated successfully.
Dec 03 21:10:04 compute-0 systemd[1]: libpod-dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb.scope: Consumed 1.365s CPU time.
Dec 03 21:10:04 compute-0 podman[94637]: 2025-12-03 21:10:04.494492341 +0000 UTC m=+1.066837321 container died dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=blissful_brattain, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:10:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-4c8a86f9c9b1ea49d4ffcc6a7a2b5409aae37adc3bdd5cbbfc22897c8d3e070e-merged.mount: Deactivated successfully.
Dec 03 21:10:04 compute-0 podman[94637]: 2025-12-03 21:10:04.535697122 +0000 UTC m=+1.108042102 container remove dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=blissful_brattain, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:10:04 compute-0 systemd[1]: libpod-conmon-dfa17126c425ac3d24f7ab11616f11dfdc377876cb8443d7528368392904b2bb.scope: Deactivated successfully.
Dec 03 21:10:04 compute-0 sudo[94562]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:10:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:10:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:04 compute-0 sudo[94829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:10:04 compute-0 sudo[94829]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:04 compute-0 sudo[94829]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:04 compute-0 sudo[94854]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:10:04 compute-0 sudo[94854]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:04 compute-0 sudo[94854]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:04 compute-0 sudo[94879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:10:04 compute-0 sudo[94879]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v74: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:05 compute-0 ceph-mon[75204]: from='client.14250 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 03 21:10:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:05 compute-0 podman[94948]: 2025-12-03 21:10:05.238544885 +0000 UTC m=+0.063712664 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:10:05 compute-0 sudo[94991]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kobqvjnpunynxixpmxyymedlrugewmfc ; /usr/bin/python3'
Dec 03 21:10:05 compute-0 sudo[94991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:05 compute-0 podman[94948]: 2025-12-03 21:10:05.345963855 +0000 UTC m=+0.171131604 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:10:05 compute-0 python3[94993]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   -s -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:05 compute-0 podman[95031]: 2025-12-03 21:10:05.539818766 +0000 UTC m=+0.042999510 container create 892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec (image=quay.io/ceph/ceph:v20, name=busy_lovelace, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:05 compute-0 systemd[1]: Started libpod-conmon-892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec.scope.
Dec 03 21:10:05 compute-0 podman[95031]: 2025-12-03 21:10:05.521039764 +0000 UTC m=+0.024220578 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:10:05 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96f6e0234955d63cb0ee3111a47765b3e75c0e72414f6af508ce57629364b2ea/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96f6e0234955d63cb0ee3111a47765b3e75c0e72414f6af508ce57629364b2ea/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:05 compute-0 podman[95031]: 2025-12-03 21:10:05.642951252 +0000 UTC m=+0.146132086 container init 892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec (image=quay.io/ceph/ceph:v20, name=busy_lovelace, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:05 compute-0 podman[95031]: 2025-12-03 21:10:05.64962896 +0000 UTC m=+0.152809714 container start 892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec (image=quay.io/ceph/ceph:v20, name=busy_lovelace, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:10:05 compute-0 podman[95031]: 2025-12-03 21:10:05.65335977 +0000 UTC m=+0.156540604 container attach 892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec (image=quay.io/ceph/ceph:v20, name=busy_lovelace, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:10:06 compute-0 sudo[94879]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/902910606' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:10:06 compute-0 busy_lovelace[95064]: 
Dec 03 21:10:06 compute-0 busy_lovelace[95064]: {"fsid":"c21de27e-a7fd-594b-8324-0697ba9aab3a","health":{"status":"HEALTH_OK","checks":{},"mutes":[]},"election_epoch":5,"quorum":[0],"quorum_names":["compute-0"],"quorum_age":124,"monmap":{"epoch":1,"min_mon_release_name":"tentacle","num_mons":1},"osdmap":{"epoch":31,"num_osds":3,"num_up_osds":3,"osd_up_since":1764796165,"num_in_osds":3,"osd_in_since":1764796141,"num_remapped_pgs":0},"pgmap":{"pgs_by_state":[{"state_name":"active+clean","count":7}],"num_pgs":7,"num_pools":7,"num_objects":24,"data_bytes":461710,"bytes_used":83910656,"bytes_avail":64328015872,"bytes_total":64411926528,"write_bytes_sec":1194,"read_op_per_sec":0,"write_op_per_sec":3},"fsmap":{"epoch":5,"btime":"2025-12-03T21:09:59:713283+0000","id":1,"up":1,"in":1,"max":1,"by_rank":[{"filesystem_id":1,"rank":0,"name":"cephfs.compute-0.gzkqle","status":"up:active","gid":14242}],"up:standby":0},"mgrmap":{"available":true,"num_standbys":0,"modules":["cephadm","iostat","nfs"],"services":{}},"servicemap":{"epoch":2,"modified":"2025-12-03T21:09:23.137474+0000","services":{"osd":{"daemons":{"summary":"","0":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}},"1":{"start_epoch":0,"start_stamp":"0.000000","gid":0,"addr":"(unrecognized address family 0)/0","metadata":{},"task_status":{}}}}}},"progress_events":{}}
Dec 03 21:10:06 compute-0 systemd[1]: libpod-892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec.scope: Deactivated successfully.
Dec 03 21:10:06 compute-0 podman[95031]: 2025-12-03 21:10:06.213387787 +0000 UTC m=+0.716568531 container died 892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec (image=quay.io/ceph/ceph:v20, name=busy_lovelace, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: pgmap v74: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:10:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/902910606' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 03 21:10:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-96f6e0234955d63cb0ee3111a47765b3e75c0e72414f6af508ce57629364b2ea-merged.mount: Deactivated successfully.
Dec 03 21:10:06 compute-0 podman[95031]: 2025-12-03 21:10:06.263489835 +0000 UTC m=+0.766670579 container remove 892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec (image=quay.io/ceph/ceph:v20, name=busy_lovelace, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:06 compute-0 sudo[95177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:10:06 compute-0 sudo[95177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:06 compute-0 systemd[1]: libpod-conmon-892914653d0ecde6f4e7ea3c291075f718b8686ab7104278f691359f55d8f5ec.scope: Deactivated successfully.
Dec 03 21:10:06 compute-0 sudo[95177]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:06 compute-0 sudo[94991]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:06 compute-0 sudo[95214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:10:06 compute-0 sudo[95214]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:06 compute-0 podman[95251]: 2025-12-03 21:10:06.648459713 +0000 UTC m=+0.048822096 container create 5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_hypatia, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030)
Dec 03 21:10:06 compute-0 systemd[1]: Started libpod-conmon-5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352.scope.
Dec 03 21:10:06 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:06 compute-0 podman[95251]: 2025-12-03 21:10:06.626407604 +0000 UTC m=+0.026769997 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:06 compute-0 podman[95251]: 2025-12-03 21:10:06.737101451 +0000 UTC m=+0.137464054 container init 5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_hypatia, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:10:06 compute-0 podman[95251]: 2025-12-03 21:10:06.744119519 +0000 UTC m=+0.144481862 container start 5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_hypatia, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default)
Dec 03 21:10:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:06 compute-0 podman[95251]: 2025-12-03 21:10:06.747816008 +0000 UTC m=+0.148178431 container attach 5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_hypatia, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.build-date=20251030)
Dec 03 21:10:06 compute-0 affectionate_hypatia[95267]: 167 167
Dec 03 21:10:06 compute-0 systemd[1]: libpod-5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352.scope: Deactivated successfully.
Dec 03 21:10:06 compute-0 podman[95251]: 2025-12-03 21:10:06.750299824 +0000 UTC m=+0.150662197 container died 5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_hypatia, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0)
Dec 03 21:10:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-3eff01d82869ee7924cf38b66159feb6dcfc2bdbb8f92e22e5b08456114af9d3-merged.mount: Deactivated successfully.
Dec 03 21:10:06 compute-0 podman[95251]: 2025-12-03 21:10:06.797771183 +0000 UTC m=+0.198133526 container remove 5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_hypatia, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:10:06 compute-0 systemd[1]: libpod-conmon-5a7c4c63c01735a747b71a8fe7c2c8c78649e227869680ddc35e7bf843452352.scope: Deactivated successfully.
Dec 03 21:10:06 compute-0 podman[95290]: 2025-12-03 21:10:06.962754102 +0000 UTC m=+0.053563242 container create 81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_khorana, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:10:07 compute-0 systemd[1]: Started libpod-conmon-81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e.scope.
Dec 03 21:10:07 compute-0 podman[95290]: 2025-12-03 21:10:06.932142094 +0000 UTC m=+0.022951324 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:07 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67bd12f2b4aae8216576c72e8ca5cfad23598a895e9414611775e30555ec15e1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67bd12f2b4aae8216576c72e8ca5cfad23598a895e9414611775e30555ec15e1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67bd12f2b4aae8216576c72e8ca5cfad23598a895e9414611775e30555ec15e1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67bd12f2b4aae8216576c72e8ca5cfad23598a895e9414611775e30555ec15e1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67bd12f2b4aae8216576c72e8ca5cfad23598a895e9414611775e30555ec15e1/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:07 compute-0 podman[95290]: 2025-12-03 21:10:07.06035696 +0000 UTC m=+0.151166140 container init 81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_khorana, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:07 compute-0 podman[95290]: 2025-12-03 21:10:07.078728281 +0000 UTC m=+0.169537431 container start 81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_khorana, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:07 compute-0 podman[95290]: 2025-12-03 21:10:07.083426357 +0000 UTC m=+0.174235527 container attach 81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_khorana, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:10:07 compute-0 sudo[95335]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgwqedywgzttgplesqpabdlrlizqtzvc ; /usr/bin/python3'
Dec 03 21:10:07 compute-0 sudo[95335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v75: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:07 compute-0 python3[95337]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   config dump -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:07 compute-0 podman[95338]: 2025-12-03 21:10:07.354245254 +0000 UTC m=+0.051942889 container create 7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251 (image=quay.io/ceph/ceph:v20, name=lucid_chaplygin, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:10:07 compute-0 systemd[1]: Started libpod-conmon-7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251.scope.
Dec 03 21:10:07 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/706499fe9869319bbf32c0a510d2a20af3962384e4201e8bb50e3e6e1061a4c8/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/706499fe9869319bbf32c0a510d2a20af3962384e4201e8bb50e3e6e1061a4c8/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:07 compute-0 podman[95338]: 2025-12-03 21:10:07.418717227 +0000 UTC m=+0.116414902 container init 7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251 (image=quay.io/ceph/ceph:v20, name=lucid_chaplygin, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:10:07 compute-0 podman[95338]: 2025-12-03 21:10:07.424989305 +0000 UTC m=+0.122686990 container start 7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251 (image=quay.io/ceph/ceph:v20, name=lucid_chaplygin, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:10:07 compute-0 podman[95338]: 2025-12-03 21:10:07.335640587 +0000 UTC m=+0.033338232 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:10:07 compute-0 podman[95338]: 2025-12-03 21:10:07.428921799 +0000 UTC m=+0.126619494 container attach 7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251 (image=quay.io/ceph/ceph:v20, name=lucid_chaplygin, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:10:07 compute-0 frosty_khorana[95307]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:10:07 compute-0 frosty_khorana[95307]: --> All data devices are unavailable
Dec 03 21:10:07 compute-0 systemd[1]: libpod-81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e.scope: Deactivated successfully.
Dec 03 21:10:07 compute-0 podman[95290]: 2025-12-03 21:10:07.613099161 +0000 UTC m=+0.703908291 container died 81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_khorana, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:10:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-67bd12f2b4aae8216576c72e8ca5cfad23598a895e9414611775e30555ec15e1-merged.mount: Deactivated successfully.
Dec 03 21:10:07 compute-0 podman[95290]: 2025-12-03 21:10:07.655443673 +0000 UTC m=+0.746252803 container remove 81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_khorana, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 03 21:10:07 compute-0 systemd[1]: libpod-conmon-81596905cca1fd04a2bbc2e1a7fa7fc3ce8a039c448ba1f6c809543e59841d3e.scope: Deactivated successfully.
Dec 03 21:10:07 compute-0 sudo[95214]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:07 compute-0 sudo[95403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:10:07 compute-0 sudo[95403]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:07 compute-0 sudo[95403]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json"} v 0)
Dec 03 21:10:07 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2874896013' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:10:07 compute-0 sudo[95428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:10:07 compute-0 lucid_chaplygin[95357]: 
Dec 03 21:10:07 compute-0 sudo[95428]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:07 compute-0 lucid_chaplygin[95357]: [{"section":"global","name":"cluster_network","value":"172.20.0.0/24","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"global","name":"container_image","value":"quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86","level":"basic","can_update_at_runtime":false,"mask":""},{"section":"global","name":"log_to_file","value":"true","level":"basic","can_update_at_runtime":true,"mask":""},{"section":"global","name":"mon_cluster_log_to_file","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"ms_bind_ipv4","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"ms_bind_ipv6","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"osd_pool_default_size","value":"1","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"global","name":"public_network","value":"192.168.122.0/24","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mon","name":"auth_allow_insecure_global_id_reclaim","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mon","name":"mon_warn_on_pool_no_redundancy","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mgr","name":"mgr/cephadm/container_init","value":"True","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/cephadm/migration_current","value":"7","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/cephadm/use_repo_digest","value":"false","level":"advanced","can_update_at_runtime":false,"mask":""},{"section":"mgr","name":"mgr/orchestrator/orchestrator","value":"cephadm","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mgr","name":"mgr_standby_modules","value":"false","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"osd","name":"osd_memory_target_autotune","value":"true","level":"advanced","can_update_at_runtime":true,"mask":""},{"section":"mds.cephfs","name":"mds_join_fs","value":"cephfs","level":"basic","can_update_at_runtime":true,"mask":""}]
Dec 03 21:10:07 compute-0 systemd[1]: libpod-7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251.scope: Deactivated successfully.
Dec 03 21:10:07 compute-0 podman[95338]: 2025-12-03 21:10:07.812220523 +0000 UTC m=+0.509918188 container died 7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251 (image=quay.io/ceph/ceph:v20, name=lucid_chaplygin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-706499fe9869319bbf32c0a510d2a20af3962384e4201e8bb50e3e6e1061a4c8-merged.mount: Deactivated successfully.
Dec 03 21:10:07 compute-0 podman[95338]: 2025-12-03 21:10:07.855338586 +0000 UTC m=+0.553036241 container remove 7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251 (image=quay.io/ceph/ceph:v20, name=lucid_chaplygin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:07 compute-0 systemd[1]: libpod-conmon-7fa5aefafafd689fd809499a678e308dd1b2f0a363d48c00a873141d480c0251.scope: Deactivated successfully.
Dec 03 21:10:07 compute-0 sudo[95335]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:08 compute-0 podman[95480]: 2025-12-03 21:10:08.094184608 +0000 UTC m=+0.045954660 container create 27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_mendel, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:10:08 compute-0 systemd[1]: Started libpod-conmon-27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423.scope.
Dec 03 21:10:08 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:08 compute-0 podman[95480]: 2025-12-03 21:10:08.074737118 +0000 UTC m=+0.026507200 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:08 compute-0 podman[95480]: 2025-12-03 21:10:08.182169599 +0000 UTC m=+0.133939681 container init 27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_mendel, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3)
Dec 03 21:10:08 compute-0 podman[95480]: 2025-12-03 21:10:08.193405719 +0000 UTC m=+0.145175771 container start 27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_mendel, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 03 21:10:08 compute-0 podman[95480]: 2025-12-03 21:10:08.19714017 +0000 UTC m=+0.148910252 container attach 27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_mendel, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:10:08 compute-0 laughing_mendel[95496]: 167 167
Dec 03 21:10:08 compute-0 systemd[1]: libpod-27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423.scope: Deactivated successfully.
Dec 03 21:10:08 compute-0 podman[95480]: 2025-12-03 21:10:08.199859192 +0000 UTC m=+0.151629274 container died 27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_mendel, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-8fdb3724f8c0b27f64a6475d1b2666ec8265e833f841d5fec5e05ebf84928608-merged.mount: Deactivated successfully.
Dec 03 21:10:08 compute-0 ceph-mon[75204]: pgmap v75: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:08 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2874896013' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Dec 03 21:10:08 compute-0 podman[95480]: 2025-12-03 21:10:08.252858559 +0000 UTC m=+0.204628621 container remove 27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_mendel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:08 compute-0 systemd[1]: libpod-conmon-27dc558dee46a242ca80d16aced2a3d9a3dfffcd73d5cd2407120cd64208e423.scope: Deactivated successfully.
Dec 03 21:10:08 compute-0 podman[95521]: 2025-12-03 21:10:08.467292499 +0000 UTC m=+0.059347737 container create ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_matsumoto, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:08 compute-0 systemd[1]: Started libpod-conmon-ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63.scope.
Dec 03 21:10:08 compute-0 podman[95521]: 2025-12-03 21:10:08.441652474 +0000 UTC m=+0.033707742 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:08 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cc9e8a2d31ffe6e30f49b67f11570888b31b2f80a1171b34f34307587bb2a55/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cc9e8a2d31ffe6e30f49b67f11570888b31b2f80a1171b34f34307587bb2a55/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cc9e8a2d31ffe6e30f49b67f11570888b31b2f80a1171b34f34307587bb2a55/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cc9e8a2d31ffe6e30f49b67f11570888b31b2f80a1171b34f34307587bb2a55/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:08 compute-0 podman[95521]: 2025-12-03 21:10:08.581367638 +0000 UTC m=+0.173422946 container init ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_matsumoto, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:10:08 compute-0 podman[95521]: 2025-12-03 21:10:08.597979912 +0000 UTC m=+0.190035180 container start ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_matsumoto, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:08 compute-0 podman[95521]: 2025-12-03 21:10:08.601919556 +0000 UTC m=+0.193974884 container attach ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_matsumoto, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:10:08 compute-0 sudo[95565]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euofkgsumweyqxtkyfqsqdcocvhfijkf ; /usr/bin/python3'
Dec 03 21:10:08 compute-0 sudo[95565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]: {
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:     "0": [
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:         {
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "devices": [
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "/dev/loop3"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             ],
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_name": "ceph_lv0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_size": "21470642176",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "name": "ceph_lv0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "tags": {
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cluster_name": "ceph",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.crush_device_class": "",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.encrypted": "0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.objectstore": "bluestore",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osd_id": "0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.type": "block",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.vdo": "0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.with_tpm": "0"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             },
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "type": "block",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "vg_name": "ceph_vg0"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:         }
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:     ],
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:     "1": [
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:         {
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "devices": [
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "/dev/loop4"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             ],
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_name": "ceph_lv1",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_size": "21470642176",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "name": "ceph_lv1",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "tags": {
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cluster_name": "ceph",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.crush_device_class": "",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.encrypted": "0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.objectstore": "bluestore",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osd_id": "1",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.type": "block",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.vdo": "0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.with_tpm": "0"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             },
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "type": "block",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "vg_name": "ceph_vg1"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:         }
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:     ],
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:     "2": [
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:         {
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "devices": [
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "/dev/loop5"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             ],
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_name": "ceph_lv2",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_size": "21470642176",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "name": "ceph_lv2",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "tags": {
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.cluster_name": "ceph",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.crush_device_class": "",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.encrypted": "0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.objectstore": "bluestore",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osd_id": "2",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.type": "block",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.vdo": "0",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:                 "ceph.with_tpm": "0"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             },
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "type": "block",
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:             "vg_name": "ceph_vg2"
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:         }
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]:     ]
Dec 03 21:10:08 compute-0 quizzical_matsumoto[95537]: }
Dec 03 21:10:08 compute-0 systemd[1]: libpod-ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63.scope: Deactivated successfully.
Dec 03 21:10:08 compute-0 podman[95521]: 2025-12-03 21:10:08.944735878 +0000 UTC m=+0.536791176 container died ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_matsumoto, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:08 compute-0 python3[95569]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   osd get-require-min-compat-client _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-7cc9e8a2d31ffe6e30f49b67f11570888b31b2f80a1171b34f34307587bb2a55-merged.mount: Deactivated successfully.
Dec 03 21:10:09 compute-0 podman[95521]: 2025-12-03 21:10:09.008846851 +0000 UTC m=+0.600902079 container remove ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_matsumoto, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:10:09 compute-0 systemd[1]: libpod-conmon-ade57dacec19039c2e468ae140c208dd15ed56bbab80f74f7b8bcbfff43bcc63.scope: Deactivated successfully.
Dec 03 21:10:09 compute-0 sudo[95428]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:09 compute-0 podman[95579]: 2025-12-03 21:10:09.047476743 +0000 UTC m=+0.051313812 container create 44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222 (image=quay.io/ceph/ceph:v20, name=admiring_wu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:09 compute-0 systemd[1]: Started libpod-conmon-44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222.scope.
Dec 03 21:10:09 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:09 compute-0 sudo[95600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:10:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ef03a25828328613f77757efda3eff99697b4e89e1e65b78aa6c6fb0c253163/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ef03a25828328613f77757efda3eff99697b4e89e1e65b78aa6c6fb0c253163/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:09 compute-0 sudo[95600]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:09 compute-0 sudo[95600]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:09 compute-0 podman[95579]: 2025-12-03 21:10:09.118241904 +0000 UTC m=+0.122079003 container init 44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222 (image=quay.io/ceph/ceph:v20, name=admiring_wu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:09 compute-0 podman[95579]: 2025-12-03 21:10:09.02488595 +0000 UTC m=+0.028723099 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:10:09 compute-0 podman[95579]: 2025-12-03 21:10:09.125341534 +0000 UTC m=+0.129178603 container start 44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222 (image=quay.io/ceph/ceph:v20, name=admiring_wu, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:10:09 compute-0 podman[95579]: 2025-12-03 21:10:09.128811887 +0000 UTC m=+0.132649036 container attach 44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222 (image=quay.io/ceph/ceph:v20, name=admiring_wu, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:10:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v76: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:09 compute-0 sudo[95630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:10:09 compute-0 sudo[95630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:09 compute-0 podman[95687]: 2025-12-03 21:10:09.427870529 +0000 UTC m=+0.036639880 container create 835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_cartwright, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 03 21:10:09 compute-0 systemd[1]: Started libpod-conmon-835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509.scope.
Dec 03 21:10:09 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:09 compute-0 podman[95687]: 2025-12-03 21:10:09.499296787 +0000 UTC m=+0.108066218 container init 835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_cartwright, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:10:09 compute-0 podman[95687]: 2025-12-03 21:10:09.505829853 +0000 UTC m=+0.114599224 container start 835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_cartwright, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030)
Dec 03 21:10:09 compute-0 podman[95687]: 2025-12-03 21:10:09.411630885 +0000 UTC m=+0.020400266 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:09 compute-0 funny_cartwright[95704]: 167 167
Dec 03 21:10:09 compute-0 systemd[1]: libpod-835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509.scope: Deactivated successfully.
Dec 03 21:10:09 compute-0 podman[95687]: 2025-12-03 21:10:09.511380421 +0000 UTC m=+0.120149772 container attach 835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_cartwright, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:10:09 compute-0 podman[95687]: 2025-12-03 21:10:09.51172172 +0000 UTC m=+0.120491081 container died 835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_cartwright, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-6eba33053ae3666065742016c25cacf9b6ba836a6b9ee652510fd282dc06c835-merged.mount: Deactivated successfully.
Dec 03 21:10:09 compute-0 podman[95687]: 2025-12-03 21:10:09.551678697 +0000 UTC m=+0.160448068 container remove 835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_cartwright, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030)
Dec 03 21:10:09 compute-0 systemd[1]: libpod-conmon-835a9b849050b7a531406c1bfd9d1b519051960c8588ee0ff1f5a6a84b71f509.scope: Deactivated successfully.
Dec 03 21:10:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd get-require-min-compat-client"} v 0)
Dec 03 21:10:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1387891732' entity='client.admin' cmd={"prefix": "osd get-require-min-compat-client"} : dispatch
Dec 03 21:10:09 compute-0 admiring_wu[95625]: mimic
Dec 03 21:10:09 compute-0 systemd[1]: libpod-44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222.scope: Deactivated successfully.
Dec 03 21:10:09 compute-0 podman[95579]: 2025-12-03 21:10:09.609824512 +0000 UTC m=+0.613661591 container died 44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222 (image=quay.io/ceph/ceph:v20, name=admiring_wu, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:10:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-0ef03a25828328613f77757efda3eff99697b4e89e1e65b78aa6c6fb0c253163-merged.mount: Deactivated successfully.
Dec 03 21:10:09 compute-0 podman[95579]: 2025-12-03 21:10:09.658780629 +0000 UTC m=+0.662617688 container remove 44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222 (image=quay.io/ceph/ceph:v20, name=admiring_wu, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:10:09 compute-0 systemd[1]: libpod-conmon-44689a57d39df38cec03bdec4d8fa8ba5618c933bf1a95f3cd006a080590a222.scope: Deactivated successfully.
Dec 03 21:10:09 compute-0 sudo[95565]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:09 compute-0 podman[95743]: 2025-12-03 21:10:09.742206149 +0000 UTC m=+0.045888537 container create 7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_grothendieck, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:10:09 compute-0 systemd[1]: Started libpod-conmon-7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8.scope.
Dec 03 21:10:09 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13772b37e426b7351eaa497578fb44c79c9f10926652252025c4b257ff648830/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13772b37e426b7351eaa497578fb44c79c9f10926652252025c4b257ff648830/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13772b37e426b7351eaa497578fb44c79c9f10926652252025c4b257ff648830/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13772b37e426b7351eaa497578fb44c79c9f10926652252025c4b257ff648830/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:09 compute-0 podman[95743]: 2025-12-03 21:10:09.724064695 +0000 UTC m=+0.027747133 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:10:09 compute-0 podman[95743]: 2025-12-03 21:10:09.831231728 +0000 UTC m=+0.134914136 container init 7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_grothendieck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:10:09 compute-0 podman[95743]: 2025-12-03 21:10:09.841225006 +0000 UTC m=+0.144907434 container start 7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_grothendieck, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS)
Dec 03 21:10:09 compute-0 podman[95743]: 2025-12-03 21:10:09.845947921 +0000 UTC m=+0.149630329 container attach 7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_grothendieck, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:10 compute-0 ceph-mon[75204]: pgmap v76: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:10 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1387891732' entity='client.admin' cmd={"prefix": "osd get-require-min-compat-client"} : dispatch
Dec 03 21:10:10 compute-0 lvm[95862]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:10:10 compute-0 lvm[95862]: VG ceph_vg1 finished
Dec 03 21:10:10 compute-0 sudo[95857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsjgsywffkqorwtrpfmliytdjzynxrij ; /usr/bin/python3'
Dec 03 21:10:10 compute-0 sudo[95857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:10 compute-0 lvm[95861]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:10:10 compute-0 lvm[95861]: VG ceph_vg0 finished
Dec 03 21:10:10 compute-0 lvm[95866]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:10:10 compute-0 lvm[95866]: VG ceph_vg2 finished
Dec 03 21:10:10 compute-0 mystifying_grothendieck[95759]: {}
Dec 03 21:10:10 compute-0 python3[95865]: ansible-ansible.legacy.command Invoked with _raw_params=podman run --rm --net=host --ipc=host   --volume /etc/ceph:/etc/ceph:z --volume /home/ceph-admin/assimilate_ceph.conf:/home/assimilate_ceph.conf:z    --entrypoint ceph quay.io/ceph/ceph:v20 --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring   versions -f json _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:10 compute-0 systemd[1]: libpod-7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8.scope: Deactivated successfully.
Dec 03 21:10:10 compute-0 systemd[1]: libpod-7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8.scope: Consumed 1.353s CPU time.
Dec 03 21:10:10 compute-0 podman[95869]: 2025-12-03 21:10:10.771738979 +0000 UTC m=+0.051504408 container create e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7 (image=quay.io/ceph/ceph:v20, name=flamboyant_fermat, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:10 compute-0 podman[95875]: 2025-12-03 21:10:10.784417247 +0000 UTC m=+0.045354322 container died 7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_grothendieck, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:10 compute-0 systemd[1]: Started libpod-conmon-e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7.scope.
Dec 03 21:10:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-13772b37e426b7351eaa497578fb44c79c9f10926652252025c4b257ff648830-merged.mount: Deactivated successfully.
Dec 03 21:10:10 compute-0 podman[95875]: 2025-12-03 21:10:10.834606598 +0000 UTC m=+0.095543643 container remove 7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=mystifying_grothendieck, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:10 compute-0 podman[95869]: 2025-12-03 21:10:10.754041175 +0000 UTC m=+0.033806604 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph:v20
Dec 03 21:10:10 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:10:10 compute-0 systemd[1]: libpod-conmon-7b01378c563676f5651f29c0979aa1f365c62a3fb550d22bc1fbd5d9e53884e8.scope: Deactivated successfully.
Dec 03 21:10:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8084179a9d395d88cbab574af8b511bca26964c7c3601ee53ffb43a1dfa7d4a5/merged/etc/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8084179a9d395d88cbab574af8b511bca26964c7c3601ee53ffb43a1dfa7d4a5/merged/home/assimilate_ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:10:10 compute-0 podman[95869]: 2025-12-03 21:10:10.867165789 +0000 UTC m=+0.146931218 container init e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7 (image=quay.io/ceph/ceph:v20, name=flamboyant_fermat, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:10:10 compute-0 podman[95869]: 2025-12-03 21:10:10.874671649 +0000 UTC m=+0.154437048 container start e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7 (image=quay.io/ceph/ceph:v20, name=flamboyant_fermat, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:10:10 compute-0 podman[95869]: 2025-12-03 21:10:10.877762082 +0000 UTC m=+0.157527491 container attach e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7 (image=quay.io/ceph/ceph:v20, name=flamboyant_fermat, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:10:10 compute-0 sudo[95630]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:10:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:10:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:10 compute-0 sudo[95903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:10:10 compute-0 sudo[95903]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:10:10 compute-0 sudo[95903]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v77: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "versions", "format": "json"} v 0)
Dec 03 21:10:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3023222579' entity='client.admin' cmd={"prefix": "versions", "format": "json"} : dispatch
Dec 03 21:10:11 compute-0 flamboyant_fermat[95899]: 
Dec 03 21:10:11 compute-0 flamboyant_fermat[95899]: {"mon":{"ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo)":1},"mgr":{"ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo)":1},"osd":{"ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo)":3},"mds":{"ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo)":1},"overall":{"ceph version 20.2.0 (69f84cc2651aa259a15bc192ddaabd3baba07489) tentacle (stable - RelWithDebInfo)":6}}
Dec 03 21:10:11 compute-0 systemd[1]: libpod-e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7.scope: Deactivated successfully.
Dec 03 21:10:11 compute-0 podman[95869]: 2025-12-03 21:10:11.428019574 +0000 UTC m=+0.707785003 container died e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7 (image=quay.io/ceph/ceph:v20, name=flamboyant_fermat, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:10:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-8084179a9d395d88cbab574af8b511bca26964c7c3601ee53ffb43a1dfa7d4a5-merged.mount: Deactivated successfully.
Dec 03 21:10:11 compute-0 podman[95869]: 2025-12-03 21:10:11.478920074 +0000 UTC m=+0.758685503 container remove e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7 (image=quay.io/ceph/ceph:v20, name=flamboyant_fermat, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:10:11 compute-0 systemd[1]: libpod-conmon-e878c72c10358f82b7658245fdc47eb21e5b579574efac2e3b1bea7a27e5b1b7.scope: Deactivated successfully.
Dec 03 21:10:11 compute-0 sudo[95857]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:11 compute-0 ceph-mon[75204]: pgmap v77: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s wr, 3 op/s
Dec 03 21:10:11 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3023222579' entity='client.admin' cmd={"prefix": "versions", "format": "json"} : dispatch
Dec 03 21:10:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v78: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:14 compute-0 ceph-mon[75204]: pgmap v78: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v79: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:16 compute-0 ceph-mon[75204]: pgmap v79: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v80: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:18 compute-0 ceph-mon[75204]: pgmap v80: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v81: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:20 compute-0 ceph-mon[75204]: pgmap v81: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v82: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:10:21
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['vms', 'images', '.mgr', 'backups', 'volumes', 'cephfs.cephfs.meta', 'cephfs.cephfs.data']
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:10:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 6.103683295811636e-07 of space, bias 4.0, pg target 0.0007324419954973963 quantized to 16 (current 1)
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 1)
Dec 03 21:10:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"} v 0)
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:10:21 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:10:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:10:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e31 do_prune osdmap full prune enabled
Dec 03 21:10:22 compute-0 ceph-mon[75204]: pgmap v82: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:22 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e32 e32: 3 total, 3 up, 3 in
Dec 03 21:10:22 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e32: 3 total, 3 up, 3 in
Dec 03 21:10:22 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev be9f06e2-0988-44e0-944c-a32f4623db2f (PG autoscaler increasing pool 2 PGs from 1 to 32)
Dec 03 21:10:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"} v 0)
Dec 03 21:10:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v84: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"} v 0)
Dec 03 21:10:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e32 do_prune osdmap full prune enabled
Dec 03 21:10:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e33 e33: 3 total, 3 up, 3 in
Dec 03 21:10:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:23 compute-0 ceph-mon[75204]: osdmap e32: 3 total, 3 up, 3 in
Dec 03 21:10:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:23 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 33 pg[2.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=33 pruub=15.815390587s) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active pruub 80.124893188s@ mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:23 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 33 pg[2.0( empty local-lis/les=16/17 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=33 pruub=15.815390587s) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown pruub 80.124893188s@ mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:23 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e33: 3 total, 3 up, 3 in
Dec 03 21:10:23 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 65be1077-8506-40cf-b9c2-7381cbe8578d (PG autoscaler increasing pool 3 PGs from 1 to 32)
Dec 03 21:10:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"} v 0)
Dec 03 21:10:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e33 do_prune osdmap full prune enabled
Dec 03 21:10:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e34 e34: 3 total, 3 up, 3 in
Dec 03 21:10:24 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e34: 3 total, 3 up, 3 in
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1d( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1e( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1c( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.b( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.a( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.9( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1f( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.8( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.6( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.5( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.4( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.3( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.2( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.7( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.c( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.d( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.e( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.f( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.10( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.12( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.14( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.15( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.13( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 164aeeb9-7441-4777-9204-3b80e724f485 (PG autoscaler increasing pool 4 PGs from 1 to 32)
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.11( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.16( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.17( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.18( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.19( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1a( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1b( empty local-lis/les=16/17 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1e( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"} v 0)
Dec 03 21:10:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:24 compute-0 ceph-mon[75204]: pgmap v84: 7 pgs: 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:24 compute-0 ceph-mon[75204]: osdmap e33: 3 total, 3 up, 3 in
Dec 03 21:10:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.e( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.0( empty local-lis/les=33/34 n=0 ec=16/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.10( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.12( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.14( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:24 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 34 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=16/16 les/c/f=17/17/0 sis=33) [2] r=0 lpr=33 pi=[16,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v87: 38 pgs: 31 unknown, 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"} v 0)
Dec 03 21:10:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"} v 0)
Dec 03 21:10:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e34 do_prune osdmap full prune enabled
Dec 03 21:10:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e35 e35: 3 total, 3 up, 3 in
Dec 03 21:10:25 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e35: 3 total, 3 up, 3 in
Dec 03 21:10:25 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 202c5c36-38ff-46f8-88bf-13132659b89d (PG autoscaler increasing pool 5 PGs from 1 to 32)
Dec 03 21:10:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"} v 0)
Dec 03 21:10:25 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"} : dispatch
Dec 03 21:10:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:25 compute-0 ceph-mon[75204]: osdmap e34: 3 total, 3 up, 3 in
Dec 03 21:10:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:25 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:25 compute-0 ceph-mon[75204]: osdmap e35: 3 total, 3 up, 3 in
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 35 pg[3.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=35 pruub=13.945694923s) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active pruub 85.778251648s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 35 pg[3.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=35 pruub=13.945694923s) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown pruub 85.778251648s@ mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e35 do_prune osdmap full prune enabled
Dec 03 21:10:26 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Dec 03 21:10:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e36 e36: 3 total, 3 up, 3 in
Dec 03 21:10:26 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e36: 3 total, 3 up, 3 in
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1b( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1d( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1c( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1f( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1e( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.a( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.9( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.8( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.7( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.6( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.5( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.3( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.4( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.2( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.b( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.c( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.d( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.e( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 5e903187-e192-4410-a195-d740445e0e64 (PG autoscaler increasing pool 6 PGs from 1 to 16)
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.f( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.10( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.11( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.12( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"} v 0)
Dec 03 21:10:26 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.13( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.14( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.15( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.16( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.17( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.18( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.19( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1a( empty local-lis/les=17/18 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.0( empty local-lis/les=35/36 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.2( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.4( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.10( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.13( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.14( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.19( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.1a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 36 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=17/17 les/c/f=18/18/0 sis=35) [1] r=0 lpr=35 pi=[17,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:26 compute-0 ceph-mon[75204]: pgmap v87: 38 pgs: 31 unknown, 7 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:26 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"} : dispatch
Dec 03 21:10:26 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Dec 03 21:10:26 compute-0 ceph-mon[75204]: osdmap e36: 3 total, 3 up, 3 in
Dec 03 21:10:26 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"} : dispatch
Dec 03 21:10:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:26 compute-0 ceph-mgr[75500]: [progress WARNING root] Starting Global Recovery Event,62 pgs not in active + clean state
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 35 pg[4.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=35 pruub=14.451667786s) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active pruub 91.378295898s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=35 pruub=14.451667786s) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown pruub 91.378295898s@ mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.3( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.2( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.5( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.7( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.4( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.6( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.9( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.8( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.b( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.a( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.d( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.c( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.16( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.18( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.17( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.1a( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.19( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.1c( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.1b( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.1e( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.1d( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.1f( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.f( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.e( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.11( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.14( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.13( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.15( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.10( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.12( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:26 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 36 pg[4.1( empty local-lis/les=18/19 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v90: 100 pgs: 62 unknown, 38 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"} v 0)
Dec 03 21:10:27 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"} : dispatch
Dec 03 21:10:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"} v 0)
Dec 03 21:10:27 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:27 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Dec 03 21:10:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e36 do_prune osdmap full prune enabled
Dec 03 21:10:27 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Dec 03 21:10:27 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:27 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Dec 03 21:10:27 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e37 e37: 3 total, 3 up, 3 in
Dec 03 21:10:27 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e37: 3 total, 3 up, 3 in
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] update: starting ev 98b7555f-24a6-4dd0-9e93-6c9c469015a1 (PG autoscaler increasing pool 7 PGs from 1 to 32)
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev be9f06e2-0988-44e0-944c-a32f4623db2f (PG autoscaler increasing pool 2 PGs from 1 to 32)
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event be9f06e2-0988-44e0-944c-a32f4623db2f (PG autoscaler increasing pool 2 PGs from 1 to 32) in 5 seconds
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 65be1077-8506-40cf-b9c2-7381cbe8578d (PG autoscaler increasing pool 3 PGs from 1 to 32)
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 65be1077-8506-40cf-b9c2-7381cbe8578d (PG autoscaler increasing pool 3 PGs from 1 to 32) in 4 seconds
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 164aeeb9-7441-4777-9204-3b80e724f485 (PG autoscaler increasing pool 4 PGs from 1 to 32)
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 164aeeb9-7441-4777-9204-3b80e724f485 (PG autoscaler increasing pool 4 PGs from 1 to 32) in 3 seconds
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 202c5c36-38ff-46f8-88bf-13132659b89d (PG autoscaler increasing pool 5 PGs from 1 to 32)
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 202c5c36-38ff-46f8-88bf-13132659b89d (PG autoscaler increasing pool 5 PGs from 1 to 32) in 2 seconds
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 5e903187-e192-4410-a195-d740445e0e64 (PG autoscaler increasing pool 6 PGs from 1 to 16)
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 5e903187-e192-4410-a195-d740445e0e64 (PG autoscaler increasing pool 6 PGs from 1 to 16) in 1 seconds
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] complete: finished ev 98b7555f-24a6-4dd0-9e93-6c9c469015a1 (PG autoscaler increasing pool 7 PGs from 1 to 32)
Dec 03 21:10:27 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event 98b7555f-24a6-4dd0-9e93-6c9c469015a1 (PG autoscaler increasing pool 7 PGs from 1 to 32) in 0 seconds
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[6.0( v 31'39 (0'0,31'39] local-lis/les=21/22 n=22 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=37 pruub=8.843221664s) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 31'38 mlcod 31'38 active pruub 86.411857605s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.1f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.1d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.1e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[6.0( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=37 pruub=8.843221664s) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 31'38 mlcod 0'0 unknown pruub 86.411857605s@ mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:27 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"} : dispatch
Dec 03 21:10:27 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:27 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec 03 21:10:27 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Dec 03 21:10:27 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:27 compute-0 ceph-mon[75204]: osdmap e37: 3 total, 3 up, 3 in
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.6( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.19( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.0( empty local-lis/les=35/37 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.3( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.15( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.17( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 37 pg[4.16( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=18/18 les/c/f=19/19/0 sis=35) [0] r=0 lpr=35 pi=[18,35)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:27 compute-0 sshd-session[95960]: Accepted publickey for zuul from 192.168.122.30 port 38176 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:10:27 compute-0 systemd-logind[787]: New session 33 of user zuul.
Dec 03 21:10:27 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 37 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=37 pruub=14.622339249s) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active pruub 83.146293640s@ mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:27 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 37 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=37 pruub=14.622339249s) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown pruub 83.146293640s@ mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:27 compute-0 systemd[1]: Started Session 33 of User zuul.
Dec 03 21:10:27 compute-0 sshd-session[95960]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:10:28 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Dec 03 21:10:28 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Dec 03 21:10:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e37 do_prune osdmap full prune enabled
Dec 03 21:10:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e38 e38: 3 total, 3 up, 3 in
Dec 03 21:10:28 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e38: 3 total, 3 up, 3 in
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.10( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.12( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.11( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.13( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.14( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.15( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=21/22 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.17( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.16( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=21/22 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.8( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.9( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.7( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-mon[75204]: pgmap v90: 100 pgs: 62 unknown, 38 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:28 compute-0 ceph-mon[75204]: 2.1d scrub starts
Dec 03 21:10:28 compute-0 ceph-mon[75204]: 2.1d scrub ok
Dec 03 21:10:28 compute-0 ceph-mon[75204]: osdmap e38: 3 total, 3 up, 3 in
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.0( empty local-lis/les=37/38 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.0( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 31'38 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 38 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=21/21 les/c/f=22/22/0 sis=37) [0] r=0 lpr=37 pi=[21,37)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:28 compute-0 python3.9[96113]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:10:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v93: 146 pgs: 108 unknown, 38 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"} v 0)
Dec 03 21:10:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:29 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec 03 21:10:29 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec 03 21:10:29 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Dec 03 21:10:29 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Dec 03 21:10:29 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Dec 03 21:10:29 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Dec 03 21:10:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e38 do_prune osdmap full prune enabled
Dec 03 21:10:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e39 e39: 3 total, 3 up, 3 in
Dec 03 21:10:29 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e39: 3 total, 3 up, 3 in
Dec 03 21:10:29 compute-0 ceph-mon[75204]: 4.1f scrub starts
Dec 03 21:10:29 compute-0 ceph-mon[75204]: 4.1f scrub ok
Dec 03 21:10:29 compute-0 ceph-mon[75204]: pgmap v93: 146 pgs: 108 unknown, 38 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"} : dispatch
Dec 03 21:10:29 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 39 pg[7.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39 pruub=8.368938446s) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active pruub 83.835609436s@ mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [1], acting_primary 1 -> 1, up_primary 1 -> 1, role 0 -> 0, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:29 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 39 pg[7.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39 pruub=8.368938446s) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown pruub 83.835609436s@ mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec 03 21:10:30 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec 03 21:10:30 compute-0 sudo[96329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbgjzoymugdycxzcszopidgfylstuomw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796230.0750785-32-198431211268273/AnsiballZ_command.py'
Dec 03 21:10:30 compute-0 sudo[96329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e39 do_prune osdmap full prune enabled
Dec 03 21:10:30 compute-0 ceph-mon[75204]: 4.1d scrub starts
Dec 03 21:10:30 compute-0 ceph-mon[75204]: 4.1d scrub ok
Dec 03 21:10:30 compute-0 ceph-mon[75204]: 3.1e scrub starts
Dec 03 21:10:30 compute-0 ceph-mon[75204]: 2.1e scrub starts
Dec 03 21:10:30 compute-0 ceph-mon[75204]: 3.1e scrub ok
Dec 03 21:10:30 compute-0 ceph-mon[75204]: 2.1e scrub ok
Dec 03 21:10:30 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 03 21:10:30 compute-0 ceph-mon[75204]: osdmap e39: 3 total, 3 up, 3 in
Dec 03 21:10:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e40 e40: 3 total, 3 up, 3 in
Dec 03 21:10:30 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e40: 3 total, 3 up, 3 in
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1e( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1d( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1c( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.11( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.13( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.16( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.10( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.17( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.15( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.14( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.a( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.b( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.9( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.8( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.f( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.6( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.4( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.5( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.7( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.2( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.3( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.c( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.d( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.e( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1f( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.18( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.19( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1a( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1b( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.12( empty local-lis/les=23/24 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.10( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.16( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.17( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.14( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.0( empty local-lis/les=39/40 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.7( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.d( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.19( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 40 pg[7.12( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=23/23 les/c/f=24/24/0 sis=39) [1] r=0 lpr=39 pi=[23,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:30 compute-0 python3.9[96331]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v96: 177 pgs: 1 peering, 31 unknown, 145 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:31 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec 03 21:10:31 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec 03 21:10:31 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec 03 21:10:31 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec 03 21:10:31 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.b scrub starts
Dec 03 21:10:31 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.b scrub ok
Dec 03 21:10:31 compute-0 ceph-mon[75204]: 3.1f scrub starts
Dec 03 21:10:31 compute-0 ceph-mon[75204]: 3.1f scrub ok
Dec 03 21:10:31 compute-0 ceph-mon[75204]: osdmap e40: 3 total, 3 up, 3 in
Dec 03 21:10:31 compute-0 ceph-mon[75204]: pgmap v96: 177 pgs: 1 peering, 31 unknown, 145 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:31 compute-0 ceph-mgr[75500]: [progress INFO root] Writing back 10 completed events
Dec 03 21:10:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 03 21:10:31 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:32 compute-0 ceph-mon[75204]: 4.1e scrub starts
Dec 03 21:10:32 compute-0 ceph-mon[75204]: 4.1e scrub ok
Dec 03 21:10:32 compute-0 ceph-mon[75204]: 3.1b scrub starts
Dec 03 21:10:32 compute-0 ceph-mon[75204]: 3.1b scrub ok
Dec 03 21:10:32 compute-0 ceph-mon[75204]: 2.b scrub starts
Dec 03 21:10:32 compute-0 ceph-mon[75204]: 2.b scrub ok
Dec 03 21:10:32 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v97: 177 pgs: 1 peering, 31 unknown, 145 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:33 compute-0 ceph-mon[75204]: pgmap v97: 177 pgs: 1 peering, 31 unknown, 145 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:34 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec 03 21:10:34 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec 03 21:10:34 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.a scrub starts
Dec 03 21:10:34 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.a scrub ok
Dec 03 21:10:34 compute-0 ceph-mon[75204]: 3.a scrub starts
Dec 03 21:10:34 compute-0 ceph-mon[75204]: 3.a scrub ok
Dec 03 21:10:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v98: 177 pgs: 1 peering, 31 unknown, 145 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:35 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Dec 03 21:10:35 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Dec 03 21:10:35 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec 03 21:10:35 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec 03 21:10:35 compute-0 ceph-mon[75204]: 4.1c scrub starts
Dec 03 21:10:35 compute-0 ceph-mon[75204]: 4.1c scrub ok
Dec 03 21:10:35 compute-0 ceph-mon[75204]: pgmap v98: 177 pgs: 1 peering, 31 unknown, 145 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:35 compute-0 ceph-mon[75204]: 3.8 scrub starts
Dec 03 21:10:35 compute-0 ceph-mon[75204]: 3.8 scrub ok
Dec 03 21:10:36 compute-0 ceph-mon[75204]: 4.8 scrub starts
Dec 03 21:10:36 compute-0 ceph-mon[75204]: 4.8 scrub ok
Dec 03 21:10:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v99: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"} v 0)
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"} v 0)
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"} v 0)
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"} v 0)
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"} v 0)
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"} v 0)
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec 03 21:10:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e40 do_prune osdmap full prune enabled
Dec 03 21:10:37 compute-0 ceph-mon[75204]: pgmap v99: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"} : dispatch
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e41 e41: 3 total, 3 up, 3 in
Dec 03 21:10:37 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e41: 3 total, 3 up, 3 in
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888752937s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577644348s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888706207s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577644348s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888567924s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577659607s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888529778s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577659607s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.894572258s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.583770752s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.894546509s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583770752s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899888039s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589317322s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899865150s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589317322s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888037682s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577720642s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888017654s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577720642s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888173103s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577987671s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888152122s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577987671s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887830734s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577682495s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887809753s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577682495s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887775421s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577735901s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887760162s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577735901s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899380684s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589401245s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888262749s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578292847s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888237953s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578292847s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899343491s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589401245s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899204254s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589408875s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887511253s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577743530s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887497902s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577743530s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899168015s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589408875s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899053574s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589332581s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899025917s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589332581s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899174690s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589561462s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899160385s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589561462s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887281418s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577758789s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887357712s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577850342s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887332916s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577850342s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887207985s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577751160s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887244225s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577758789s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887195587s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577751160s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898865700s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589576721s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898849487s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887292862s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578041077s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887272835s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578041077s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887163162s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578002930s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887149811s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578002930s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898716927s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589614868s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887128830s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578063965s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898696899s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589614868s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887117386s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578063965s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887083054s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578102112s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887088776s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578125000s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887045860s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578102112s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887156487s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578285217s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887145042s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886906624s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578125000s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886973381s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578285217s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886903763s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886874199s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578308105s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886862755s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578308105s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886787415s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578254700s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886762619s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578254700s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.d( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.f( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.2( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.4( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.1( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884406090s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348602295s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880822182s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.345054626s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884387016s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348602295s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880803108s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.345054626s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853688240s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.318038940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853674889s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.7( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853358269s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.318038940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.5( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853232384s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.14( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.12( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.10( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.867569923s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975608826s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.929427147s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037498474s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.867554665s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975608826s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.867215157s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975303650s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.929405212s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037498474s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.867195129s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975303650s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.929409981s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037582397s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.929397583s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037582397s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1d( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.1b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852234840s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317932129s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.866736412s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975479126s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.866722107s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975479126s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.866712570s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975585938s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852218628s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317932129s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.866684914s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975585938s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928505898s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037483215s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928495407s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037483215s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.866079330s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975204468s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.866065025s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975204468s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.866003036s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975250244s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.865991592s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975250244s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928308487s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037651062s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.851003647s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317916870s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850990295s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317916870s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881216049s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348243713s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881206512s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348243713s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850802422s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317840576s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850778580s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317840576s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850716591s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317825317s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850179672s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880301476s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348007202s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850170135s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317924500s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880270004s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348007202s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880525589s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348281860s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850155830s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317924500s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928297997s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037651062s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.865748405s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975204468s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.865737915s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975204468s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.865626335s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975189209s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.865614891s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975189209s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880499840s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348281860s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880378723s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348236084s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880367279s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348236084s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880339622s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348297119s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849840164s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317825317s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880279541s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348289490s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849815369s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880253792s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348289490s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880223274s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348297119s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849054337s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317298889s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849026680s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317298889s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848933220s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317253113s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880013466s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348335266s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848918915s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317253113s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879985809s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348335266s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880131721s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348648071s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880120277s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348648071s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848572731s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317153931s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848423004s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317047119s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848547935s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317153931s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880009651s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348655701s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879998207s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348655701s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848355293s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317047119s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879922867s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348686218s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848220825s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317001343s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879914284s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848196983s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317001343s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879804611s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348670959s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848138809s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317008972s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879792213s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348670959s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848105431s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317008972s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848002434s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316963196s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847991943s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316963196s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879694939s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348716736s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879670143s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348716736s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879558563s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348678589s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879547119s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348678589s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847544670s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316764832s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879440308s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348686218s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879426003s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847513199s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316764832s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847468376s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316741943s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847443581s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316741943s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847066879s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316398621s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847057343s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316398621s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846938133s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316390991s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846894264s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316352844s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846924782s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316390991s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879432678s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348930359s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846867561s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316352844s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879405975s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348930359s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843698502s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.313354492s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879385948s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.349082947s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843666077s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.313354492s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846693993s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316413879s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879362106s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349082947s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846668243s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316413879s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879122734s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348976135s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879112244s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348976135s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847078323s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316993713s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879151344s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.349098206s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847055435s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316993713s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879140854s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349098206s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928034782s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037666321s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928025246s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037666321s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927963257s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037696838s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927952766s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037696838s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.865342140s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975158691s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.865333557s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975158691s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927793503s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037719727s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927783012s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037719727s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927693367s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037719727s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927684784s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037719727s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927641869s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037757874s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927632332s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.929178238s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039398193s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.929169655s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039398193s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864471436s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974800110s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864461899s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974800110s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927361488s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037757874s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927351952s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864676476s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975151062s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864668846s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975151062s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864478111s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975067139s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864466667s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975067139s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928797722s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039489746s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928786278s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039489746s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864211082s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974967957s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864202499s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974967957s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928587914s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039421082s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928578377s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039421082s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863828659s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974784851s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863812447s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928451538s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039451599s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928428650s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039451599s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863724709s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974784851s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863713264s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928549767s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039695740s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928540230s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039695740s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863598824s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974769592s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863583565s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974769592s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863542557s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974784851s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863531113s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928400040s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039718628s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928479195s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039802551s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928391457s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039718628s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928464890s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039802551s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858943939s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.970329285s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858925819s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970329285s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928402901s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039848328s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928389549s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039848328s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863302231s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974792480s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863292694s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974792480s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858806610s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.970382690s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928353310s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039932251s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858794212s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970382690s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928339958s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039932251s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928306580s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039985657s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858775139s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.970458984s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928293228s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039985657s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858759880s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970458984s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:37 compute-0 sudo[96329]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:38 compute-0 sshd-session[95963]: Connection closed by 192.168.122.30 port 38176
Dec 03 21:10:38 compute-0 sshd-session[95960]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:10:38 compute-0 systemd[76584]: Starting Mark boot as successful...
Dec 03 21:10:38 compute-0 systemd[76584]: Finished Mark boot as successful.
Dec 03 21:10:38 compute-0 systemd-logind[787]: Session 33 logged out. Waiting for processes to exit.
Dec 03 21:10:38 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Dec 03 21:10:38 compute-0 systemd[1]: session-33.scope: Consumed 8.847s CPU time.
Dec 03 21:10:38 compute-0 systemd-logind[787]: Removed session 33.
Dec 03 21:10:38 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec 03 21:10:38 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec 03 21:10:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e41 do_prune osdmap full prune enabled
Dec 03 21:10:38 compute-0 ceph-mon[75204]: 2.1c scrub starts
Dec 03 21:10:38 compute-0 ceph-mon[75204]: 2.1c scrub ok
Dec 03 21:10:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Dec 03 21:10:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:38 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 03 21:10:38 compute-0 ceph-mon[75204]: osdmap e41: 3 total, 3 up, 3 in
Dec 03 21:10:38 compute-0 ceph-mon[75204]: 7.1e scrub starts
Dec 03 21:10:38 compute-0 ceph-mon[75204]: 7.1e scrub ok
Dec 03 21:10:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e42 e42: 3 total, 3 up, 3 in
Dec 03 21:10:38 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e42: 3 total, 3 up, 3 in
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:38 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v102: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"} v 0)
Dec 03 21:10:39 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"} : dispatch
Dec 03 21:10:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e42 do_prune osdmap full prune enabled
Dec 03 21:10:39 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Dec 03 21:10:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e43 e43: 3 total, 3 up, 3 in
Dec 03 21:10:39 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e43: 3 total, 3 up, 3 in
Dec 03 21:10:39 compute-0 ceph-mon[75204]: osdmap e42: 3 total, 3 up, 3 in
Dec 03 21:10:39 compute-0 ceph-mon[75204]: pgmap v102: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:10:39 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"} : dispatch
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759346962s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.589576721s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759279251s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.589569092s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759297371s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759241104s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589569092s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759024620s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.589576721s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.753028870s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.583694458s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.758980751s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:39 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.752966881s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583694458s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:39 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:39 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:39 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:39 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:40 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Dec 03 21:10:40 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Dec 03 21:10:40 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 03 21:10:40 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec 03 21:10:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e43 do_prune osdmap full prune enabled
Dec 03 21:10:40 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Dec 03 21:10:40 compute-0 ceph-mon[75204]: osdmap e43: 3 total, 3 up, 3 in
Dec 03 21:10:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e44 e44: 3 total, 3 up, 3 in
Dec 03 21:10:40 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e44: 3 total, 3 up, 3 in
Dec 03 21:10:40 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:40 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:40 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:40 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v105: 177 pgs: 4 peering, 173 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 201 B/s, 2 keys/s, 2 objects/s recovering
Dec 03 21:10:41 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Dec 03 21:10:41 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Dec 03 21:10:41 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec 03 21:10:41 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec 03 21:10:41 compute-0 ceph-mon[75204]: 5.1c scrub starts
Dec 03 21:10:41 compute-0 ceph-mon[75204]: 5.1c scrub ok
Dec 03 21:10:41 compute-0 ceph-mon[75204]: 4.b scrub starts
Dec 03 21:10:41 compute-0 ceph-mon[75204]: 4.b scrub ok
Dec 03 21:10:41 compute-0 ceph-mon[75204]: osdmap e44: 3 total, 3 up, 3 in
Dec 03 21:10:41 compute-0 ceph-mon[75204]: pgmap v105: 177 pgs: 4 peering, 173 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 201 B/s, 2 keys/s, 2 objects/s recovering
Dec 03 21:10:41 compute-0 ceph-mon[75204]: 3.1a scrub starts
Dec 03 21:10:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:42 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec 03 21:10:42 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec 03 21:10:42 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Dec 03 21:10:42 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Dec 03 21:10:42 compute-0 ceph-mon[75204]: 2.1a scrub starts
Dec 03 21:10:42 compute-0 ceph-mon[75204]: 2.1a scrub ok
Dec 03 21:10:42 compute-0 ceph-mon[75204]: 3.1a scrub ok
Dec 03 21:10:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v106: 177 pgs: 4 peering, 173 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 145 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Dec 03 21:10:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Dec 03 21:10:43 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec 03 21:10:43 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec 03 21:10:43 compute-0 ceph-mon[75204]: 5.1f scrub starts
Dec 03 21:10:43 compute-0 ceph-mon[75204]: 5.1f scrub ok
Dec 03 21:10:43 compute-0 ceph-mon[75204]: 4.6 scrub starts
Dec 03 21:10:43 compute-0 ceph-mon[75204]: 4.6 scrub ok
Dec 03 21:10:43 compute-0 ceph-mon[75204]: pgmap v106: 177 pgs: 4 peering, 173 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 145 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:44 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec 03 21:10:44 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec 03 21:10:44 compute-0 ceph-mon[75204]: 5.10 scrub starts
Dec 03 21:10:44 compute-0 ceph-mon[75204]: 5.10 scrub ok
Dec 03 21:10:44 compute-0 ceph-mon[75204]: 7.1d scrub starts
Dec 03 21:10:44 compute-0 ceph-mon[75204]: 7.1d scrub ok
Dec 03 21:10:44 compute-0 ceph-mon[75204]: 3.19 scrub starts
Dec 03 21:10:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v107: 177 pgs: 4 peering, 173 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 123 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:45 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Dec 03 21:10:45 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Dec 03 21:10:45 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Dec 03 21:10:45 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Dec 03 21:10:45 compute-0 ceph-mon[75204]: 3.19 scrub ok
Dec 03 21:10:45 compute-0 ceph-mon[75204]: pgmap v107: 177 pgs: 4 peering, 173 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 123 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:46 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Dec 03 21:10:46 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Dec 03 21:10:46 compute-0 ceph-mon[75204]: 2.14 scrub starts
Dec 03 21:10:46 compute-0 ceph-mon[75204]: 2.14 scrub ok
Dec 03 21:10:46 compute-0 ceph-mon[75204]: 4.19 scrub starts
Dec 03 21:10:46 compute-0 ceph-mon[75204]: 4.19 scrub ok
Dec 03 21:10:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v108: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 100 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"} v 0)
Dec 03 21:10:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"} : dispatch
Dec 03 21:10:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Dec 03 21:10:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Dec 03 21:10:47 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Dec 03 21:10:47 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Dec 03 21:10:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e44 do_prune osdmap full prune enabled
Dec 03 21:10:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Dec 03 21:10:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e45 e45: 3 total, 3 up, 3 in
Dec 03 21:10:47 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e45: 3 total, 3 up, 3 in
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.897034645s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.124214172s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896965027s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124214172s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900068283s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.127418518s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900019646s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127418518s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900277138s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.127799988s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900236130s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127799988s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896610260s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.124343872s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:47 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896554947s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124343872s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:47 compute-0 ceph-mon[75204]: 4.3 scrub starts
Dec 03 21:10:47 compute-0 ceph-mon[75204]: 4.3 scrub ok
Dec 03 21:10:47 compute-0 ceph-mon[75204]: pgmap v108: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 100 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:47 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"} : dispatch
Dec 03 21:10:47 compute-0 ceph-mon[75204]: 7.12 scrub starts
Dec 03 21:10:47 compute-0 ceph-mon[75204]: 7.12 scrub ok
Dec 03 21:10:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:48 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Dec 03 21:10:48 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Dec 03 21:10:48 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.c scrub starts
Dec 03 21:10:48 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.c scrub ok
Dec 03 21:10:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e45 do_prune osdmap full prune enabled
Dec 03 21:10:48 compute-0 ceph-mon[75204]: 4.0 scrub starts
Dec 03 21:10:48 compute-0 ceph-mon[75204]: 4.0 scrub ok
Dec 03 21:10:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Dec 03 21:10:48 compute-0 ceph-mon[75204]: osdmap e45: 3 total, 3 up, 3 in
Dec 03 21:10:48 compute-0 ceph-mon[75204]: 2.12 scrub starts
Dec 03 21:10:48 compute-0 ceph-mon[75204]: 2.12 scrub ok
Dec 03 21:10:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e46 e46: 3 total, 3 up, 3 in
Dec 03 21:10:48 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e46: 3 total, 3 up, 3 in
Dec 03 21:10:48 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:48 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:48 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:48 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v111: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 03 21:10:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"} v 0)
Dec 03 21:10:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"} : dispatch
Dec 03 21:10:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e46 do_prune osdmap full prune enabled
Dec 03 21:10:49 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Dec 03 21:10:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e47 e47: 3 total, 3 up, 3 in
Dec 03 21:10:49 compute-0 ceph-mon[75204]: 4.c scrub starts
Dec 03 21:10:49 compute-0 ceph-mon[75204]: 4.c scrub ok
Dec 03 21:10:49 compute-0 ceph-mon[75204]: osdmap e46: 3 total, 3 up, 3 in
Dec 03 21:10:49 compute-0 ceph-mon[75204]: pgmap v111: 177 pgs: 177 active+clean; 451 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 03 21:10:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"} : dispatch
Dec 03 21:10:49 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e47: 3 total, 3 up, 3 in
Dec 03 21:10:50 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Dec 03 21:10:50 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Dec 03 21:10:50 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec 03 21:10:50 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec 03 21:10:50 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec 03 21:10:50 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec 03 21:10:50 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Dec 03 21:10:50 compute-0 ceph-mon[75204]: osdmap e47: 3 total, 3 up, 3 in
Dec 03 21:10:50 compute-0 ceph-mon[75204]: 2.10 scrub starts
Dec 03 21:10:50 compute-0 ceph-mon[75204]: 2.10 scrub ok
Dec 03 21:10:50 compute-0 ceph-mon[75204]: 3.14 scrub starts
Dec 03 21:10:50 compute-0 ceph-mon[75204]: 3.14 scrub ok
Dec 03 21:10:51 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec 03 21:10:51 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v113: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 107 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"} v 0)
Dec 03 21:10:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"} : dispatch
Dec 03 21:10:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:10:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e47 do_prune osdmap full prune enabled
Dec 03 21:10:51 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Dec 03 21:10:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e48 e48: 3 total, 3 up, 3 in
Dec 03 21:10:51 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e48: 3 total, 3 up, 3 in
Dec 03 21:10:51 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850582123s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 active pruub 108.127700806s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:51 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846266747s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 active pruub 108.123405457s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:51 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850494385s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.127700806s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:51 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846174240s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.123405457s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:51 compute-0 ceph-mon[75204]: 4.17 scrub starts
Dec 03 21:10:51 compute-0 ceph-mon[75204]: 4.17 scrub ok
Dec 03 21:10:51 compute-0 ceph-mon[75204]: 5.17 scrub starts
Dec 03 21:10:51 compute-0 ceph-mon[75204]: 5.17 scrub ok
Dec 03 21:10:51 compute-0 ceph-mon[75204]: pgmap v113: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 107 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:51 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"} : dispatch
Dec 03 21:10:51 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:51 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:51 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712948799s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 active pruub 110.589859009s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:51 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712782860s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.589859009s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:51 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:51 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.706089020s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 active pruub 110.583999634s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:51 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.705857277s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.583999634s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:51 compute-0 ceph-mgr[75500]: [progress INFO root] Completed event a9d95566-a29f-4277-9ca9-f7ef602a478d (Global Recovery Event) in 25 seconds
Dec 03 21:10:51 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:52 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 03 21:10:52 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec 03 21:10:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e48 do_prune osdmap full prune enabled
Dec 03 21:10:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e49 e49: 3 total, 3 up, 3 in
Dec 03 21:10:52 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e49: 3 total, 3 up, 3 in
Dec 03 21:10:52 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:52 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Dec 03 21:10:52 compute-0 ceph-mon[75204]: osdmap e48: 3 total, 3 up, 3 in
Dec 03 21:10:52 compute-0 ceph-mon[75204]: 5.8 scrub starts
Dec 03 21:10:52 compute-0 ceph-mon[75204]: 5.8 scrub ok
Dec 03 21:10:52 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:52 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:52 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v116: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 147 B/s, 2 keys/s, 1 objects/s recovering
Dec 03 21:10:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"} v 0)
Dec 03 21:10:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"} : dispatch
Dec 03 21:10:53 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Dec 03 21:10:53 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Dec 03 21:10:53 compute-0 sshd-session[96390]: Accepted publickey for zuul from 192.168.122.30 port 59376 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:10:53 compute-0 systemd-logind[787]: New session 34 of user zuul.
Dec 03 21:10:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e49 do_prune osdmap full prune enabled
Dec 03 21:10:53 compute-0 systemd[1]: Started Session 34 of User zuul.
Dec 03 21:10:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Dec 03 21:10:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e50 e50: 3 total, 3 up, 3 in
Dec 03 21:10:53 compute-0 sshd-session[96390]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:10:53 compute-0 ceph-mon[75204]: osdmap e49: 3 total, 3 up, 3 in
Dec 03 21:10:53 compute-0 ceph-mon[75204]: pgmap v116: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 147 B/s, 2 keys/s, 1 objects/s recovering
Dec 03 21:10:53 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"} : dispatch
Dec 03 21:10:53 compute-0 ceph-mon[75204]: 7.10 scrub starts
Dec 03 21:10:53 compute-0 ceph-mon[75204]: 7.10 scrub ok
Dec 03 21:10:53 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e50: 3 total, 3 up, 3 in
Dec 03 21:10:54 compute-0 python3.9[96543]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 03 21:10:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Dec 03 21:10:54 compute-0 ceph-mon[75204]: osdmap e50: 3 total, 3 up, 3 in
Dec 03 21:10:55 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec 03 21:10:55 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec 03 21:10:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v118: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 120 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"} v 0)
Dec 03 21:10:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"} : dispatch
Dec 03 21:10:55 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec 03 21:10:55 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec 03 21:10:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e50 do_prune osdmap full prune enabled
Dec 03 21:10:55 compute-0 ceph-mon[75204]: 2.e scrub starts
Dec 03 21:10:55 compute-0 ceph-mon[75204]: 2.e scrub ok
Dec 03 21:10:55 compute-0 ceph-mon[75204]: pgmap v118: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 120 B/s, 1 keys/s, 1 objects/s recovering
Dec 03 21:10:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"} : dispatch
Dec 03 21:10:55 compute-0 ceph-mon[75204]: 3.13 scrub starts
Dec 03 21:10:55 compute-0 ceph-mon[75204]: 3.13 scrub ok
Dec 03 21:10:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Dec 03 21:10:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e51 e51: 3 total, 3 up, 3 in
Dec 03 21:10:55 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e51: 3 total, 3 up, 3 in
Dec 03 21:10:56 compute-0 python3.9[96717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:10:56 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec 03 21:10:56 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec 03 21:10:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:10:56 compute-0 ceph-mgr[75500]: [progress INFO root] Writing back 11 completed events
Dec 03 21:10:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 03 21:10:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Dec 03 21:10:56 compute-0 ceph-mon[75204]: osdmap e51: 3 total, 3 up, 3 in
Dec 03 21:10:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:10:57 compute-0 sudo[96871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhlitrujhegqqbxegfkajgmzbdxrmpts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796256.5448804-45-157958633484797/AnsiballZ_command.py'
Dec 03 21:10:57 compute-0 sudo[96871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v120: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 331 B/s, 1 objects/s recovering
Dec 03 21:10:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"} v 0)
Dec 03 21:10:57 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"} : dispatch
Dec 03 21:10:57 compute-0 python3.9[96873]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:10:57 compute-0 sudo[96871]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e51 do_prune osdmap full prune enabled
Dec 03 21:10:57 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Dec 03 21:10:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e52 e52: 3 total, 3 up, 3 in
Dec 03 21:10:57 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e52: 3 total, 3 up, 3 in
Dec 03 21:10:57 compute-0 ceph-mon[75204]: 4.16 scrub starts
Dec 03 21:10:57 compute-0 ceph-mon[75204]: 4.16 scrub ok
Dec 03 21:10:57 compute-0 ceph-mon[75204]: pgmap v120: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 331 B/s, 1 objects/s recovering
Dec 03 21:10:57 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"} : dispatch
Dec 03 21:10:58 compute-0 sudo[97024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwpbufcrpnsenjyogvgynyadjbdsgecz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796257.65009-57-29547763035420/AnsiballZ_stat.py'
Dec 03 21:10:58 compute-0 sudo[97024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:58 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Dec 03 21:10:58 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Dec 03 21:10:58 compute-0 python3.9[97026]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:10:58 compute-0 sudo[97024]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:58 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Dec 03 21:10:58 compute-0 ceph-mon[75204]: osdmap e52: 3 total, 3 up, 3 in
Dec 03 21:10:59 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355226517s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 active pruub 118.589836121s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:10:59 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355180740s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.589836121s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:10:59 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:10:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v122: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 296 B/s, 1 objects/s recovering
Dec 03 21:10:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"} v 0)
Dec 03 21:10:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"} : dispatch
Dec 03 21:10:59 compute-0 sudo[97178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucbjmnwhsaekxrpdearbruomdcwoktpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796258.724237-68-108929528256056/AnsiballZ_file.py'
Dec 03 21:10:59 compute-0 sudo[97178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:10:59 compute-0 python3.9[97180]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:10:59 compute-0 sudo[97178]: pam_unix(sudo:session): session closed for user root
Dec 03 21:10:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e52 do_prune osdmap full prune enabled
Dec 03 21:10:59 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Dec 03 21:10:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e53 e53: 3 total, 3 up, 3 in
Dec 03 21:10:59 compute-0 ceph-mon[75204]: 4.15 scrub starts
Dec 03 21:10:59 compute-0 ceph-mon[75204]: 4.15 scrub ok
Dec 03 21:10:59 compute-0 ceph-mon[75204]: pgmap v122: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 296 B/s, 1 objects/s recovering
Dec 03 21:10:59 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"} : dispatch
Dec 03 21:10:59 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:10:59 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e53: 3 total, 3 up, 3 in
Dec 03 21:11:00 compute-0 sudo[97330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdmmzzrnvkefkdhnfctwwwfrmncqwnvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796259.693195-77-202567588668923/AnsiballZ_file.py'
Dec 03 21:11:00 compute-0 sudo[97330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:11:00 compute-0 python3.9[97332]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:11:00 compute-0 sudo[97330]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:00 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Dec 03 21:11:00 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Dec 03 21:11:00 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Dec 03 21:11:00 compute-0 ceph-mon[75204]: osdmap e53: 3 total, 3 up, 3 in
Dec 03 21:11:00 compute-0 ceph-mon[75204]: 7.17 scrub starts
Dec 03 21:11:00 compute-0 ceph-mon[75204]: 7.17 scrub ok
Dec 03 21:11:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v124: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 296 B/s, 1 objects/s recovering
Dec 03 21:11:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"} v 0)
Dec 03 21:11:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"} : dispatch
Dec 03 21:11:01 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec 03 21:11:01 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec 03 21:11:01 compute-0 python3.9[97482]: ansible-ansible.builtin.service_facts Invoked
Dec 03 21:11:01 compute-0 network[97499]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:11:01 compute-0 network[97500]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:11:01 compute-0 network[97501]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:11:01 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.090121269s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 active pruub 116.124732971s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:11:01 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.089967728s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 116.124732971s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:11:01 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:11:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e53 do_prune osdmap full prune enabled
Dec 03 21:11:01 compute-0 ceph-mon[75204]: pgmap v124: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 296 B/s, 1 objects/s recovering
Dec 03 21:11:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"} : dispatch
Dec 03 21:11:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Dec 03 21:11:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e54 e54: 3 total, 3 up, 3 in
Dec 03 21:11:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e54: 3 total, 3 up, 3 in
Dec 03 21:11:01 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742419243s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 active pruub 118.168533325s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:11:01 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742380142s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.168533325s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:11:01 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:11:01 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:11:02 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec 03 21:11:02 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec 03 21:11:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e54 do_prune osdmap full prune enabled
Dec 03 21:11:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e55 e55: 3 total, 3 up, 3 in
Dec 03 21:11:02 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e55: 3 total, 3 up, 3 in
Dec 03 21:11:02 compute-0 ceph-mon[75204]: 2.13 scrub starts
Dec 03 21:11:02 compute-0 ceph-mon[75204]: 2.13 scrub ok
Dec 03 21:11:02 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Dec 03 21:11:02 compute-0 ceph-mon[75204]: osdmap e54: 3 total, 3 up, 3 in
Dec 03 21:11:02 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=54/55 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:11:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v127: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:03 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"} v 0)
Dec 03 21:11:03 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"} : dispatch
Dec 03 21:11:03 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e55 do_prune osdmap full prune enabled
Dec 03 21:11:03 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Dec 03 21:11:03 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e56 e56: 3 total, 3 up, 3 in
Dec 03 21:11:03 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e56: 3 total, 3 up, 3 in
Dec 03 21:11:03 compute-0 ceph-mon[75204]: 5.14 scrub starts
Dec 03 21:11:03 compute-0 ceph-mon[75204]: 5.14 scrub ok
Dec 03 21:11:03 compute-0 ceph-mon[75204]: osdmap e55: 3 total, 3 up, 3 in
Dec 03 21:11:03 compute-0 ceph-mon[75204]: pgmap v127: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"} : dispatch
Dec 03 21:11:04 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec 03 21:11:04 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec 03 21:11:04 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec 03 21:11:04 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec 03 21:11:04 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310461044s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 active pruub 122.845947266s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:11:04 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310279846s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY pruub 122.845947266s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:11:04 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:11:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e56 do_prune osdmap full prune enabled
Dec 03 21:11:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e57 e57: 3 total, 3 up, 3 in
Dec 03 21:11:04 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e57: 3 total, 3 up, 3 in
Dec 03 21:11:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Dec 03 21:11:05 compute-0 ceph-mon[75204]: osdmap e56: 3 total, 3 up, 3 in
Dec 03 21:11:05 compute-0 ceph-mon[75204]: 5.a scrub starts
Dec 03 21:11:05 compute-0 ceph-mon[75204]: 5.a scrub ok
Dec 03 21:11:05 compute-0 ceph-mon[75204]: 7.16 scrub starts
Dec 03 21:11:05 compute-0 ceph-mon[75204]: 7.16 scrub ok
Dec 03 21:11:05 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:11:05 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec 03 21:11:05 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec 03 21:11:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v130: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"} v 0)
Dec 03 21:11:05 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"} : dispatch
Dec 03 21:11:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e57 do_prune osdmap full prune enabled
Dec 03 21:11:06 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Dec 03 21:11:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e58 e58: 3 total, 3 up, 3 in
Dec 03 21:11:06 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e58: 3 total, 3 up, 3 in
Dec 03 21:11:06 compute-0 ceph-mon[75204]: osdmap e57: 3 total, 3 up, 3 in
Dec 03 21:11:06 compute-0 ceph-mon[75204]: 5.b scrub starts
Dec 03 21:11:06 compute-0 ceph-mon[75204]: 5.b scrub ok
Dec 03 21:11:06 compute-0 ceph-mon[75204]: pgmap v130: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:06 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"} : dispatch
Dec 03 21:11:06 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.c scrub starts
Dec 03 21:11:06 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.c scrub ok
Dec 03 21:11:06 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec 03 21:11:06 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec 03 21:11:06 compute-0 python3.9[97761]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:11:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:07 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Dec 03 21:11:07 compute-0 ceph-mon[75204]: osdmap e58: 3 total, 3 up, 3 in
Dec 03 21:11:07 compute-0 ceph-mon[75204]: 2.c scrub starts
Dec 03 21:11:07 compute-0 ceph-mon[75204]: 2.c scrub ok
Dec 03 21:11:07 compute-0 ceph-mon[75204]: 3.10 scrub starts
Dec 03 21:11:07 compute-0 ceph-mon[75204]: 3.10 scrub ok
Dec 03 21:11:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v132: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 03 21:11:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"} v 0)
Dec 03 21:11:07 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"} : dispatch
Dec 03 21:11:07 compute-0 python3.9[97911]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:11:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e58 do_prune osdmap full prune enabled
Dec 03 21:11:08 compute-0 ceph-mon[75204]: pgmap v132: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 03 21:11:08 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"} : dispatch
Dec 03 21:11:08 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Dec 03 21:11:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e59 e59: 3 total, 3 up, 3 in
Dec 03 21:11:08 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e59: 3 total, 3 up, 3 in
Dec 03 21:11:08 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec 03 21:11:08 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec 03 21:11:08 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec 03 21:11:08 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec 03 21:11:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Dec 03 21:11:09 compute-0 ceph-mon[75204]: osdmap e59: 3 total, 3 up, 3 in
Dec 03 21:11:09 compute-0 ceph-mon[75204]: 7.14 scrub starts
Dec 03 21:11:09 compute-0 ceph-mon[75204]: 7.14 scrub ok
Dec 03 21:11:09 compute-0 ceph-mon[75204]: 3.12 scrub starts
Dec 03 21:11:09 compute-0 ceph-mon[75204]: 3.12 scrub ok
Dec 03 21:11:09 compute-0 python3.9[98065]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:11:09 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.0 scrub starts
Dec 03 21:11:09 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.0 scrub ok
Dec 03 21:11:09 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654327393s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 active pruub 134.886367798s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:11:09 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654233932s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY pruub 134.886367798s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:11:09 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:11:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v134: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 03 21:11:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"} v 0)
Dec 03 21:11:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"} : dispatch
Dec 03 21:11:09 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec 03 21:11:09 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec 03 21:11:09 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Dec 03 21:11:09 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Dec 03 21:11:10 compute-0 sudo[98221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgeupgncopkqrmpvhhxggfzxqaxgkosp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796269.566712-125-78436423023335/AnsiballZ_setup.py'
Dec 03 21:11:10 compute-0 sudo[98221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:11:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e59 do_prune osdmap full prune enabled
Dec 03 21:11:10 compute-0 ceph-mon[75204]: 2.0 scrub starts
Dec 03 21:11:10 compute-0 ceph-mon[75204]: 2.0 scrub ok
Dec 03 21:11:10 compute-0 ceph-mon[75204]: pgmap v134: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 03 21:11:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"} : dispatch
Dec 03 21:11:10 compute-0 ceph-mon[75204]: 7.b scrub starts
Dec 03 21:11:10 compute-0 ceph-mon[75204]: 7.b scrub ok
Dec 03 21:11:10 compute-0 ceph-mon[75204]: 5.15 scrub starts
Dec 03 21:11:10 compute-0 ceph-mon[75204]: 5.15 scrub ok
Dec 03 21:11:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Dec 03 21:11:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e60 e60: 3 total, 3 up, 3 in
Dec 03 21:11:10 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e60: 3 total, 3 up, 3 in
Dec 03 21:11:10 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Dec 03 21:11:10 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:11:10 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Dec 03 21:11:10 compute-0 python3.9[98223]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:11:10 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.d scrub starts
Dec 03 21:11:10 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.d scrub ok
Dec 03 21:11:10 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Dec 03 21:11:10 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Dec 03 21:11:10 compute-0 sudo[98221]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:11 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Dec 03 21:11:11 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Dec 03 21:11:11 compute-0 sudo[98255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:11:11 compute-0 sudo[98255]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:11 compute-0 sudo[98255]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Dec 03 21:11:11 compute-0 ceph-mon[75204]: osdmap e60: 3 total, 3 up, 3 in
Dec 03 21:11:11 compute-0 ceph-mon[75204]: 5.0 scrub starts
Dec 03 21:11:11 compute-0 ceph-mon[75204]: 5.0 scrub ok
Dec 03 21:11:11 compute-0 ceph-mon[75204]: 3.d scrub starts
Dec 03 21:11:11 compute-0 ceph-mon[75204]: 3.d scrub ok
Dec 03 21:11:11 compute-0 ceph-mon[75204]: 2.11 scrub starts
Dec 03 21:11:11 compute-0 ceph-mon[75204]: 2.11 scrub ok
Dec 03 21:11:11 compute-0 sudo[98304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:11:11 compute-0 sudo[98304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:11 compute-0 sudo[98355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxpjwisjouvmonqnyurtlozrqidwbzbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796269.566712-125-78436423023335/AnsiballZ_dnf.py'
Dec 03 21:11:11 compute-0 sudo[98355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:11:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v136: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 11 B/s, 0 objects/s recovering
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"} v 0)
Dec 03 21:11:11 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"} : dispatch
Dec 03 21:11:11 compute-0 python3.9[98357]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:11:11 compute-0 sudo[98304]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:11:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:11:11 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:11:11 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:11:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:11:11 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:11:11 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:11:11 compute-0 sudo[98393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:11:11 compute-0 sudo[98393]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:11 compute-0 sudo[98393]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:11 compute-0 sudo[98421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:11:11 compute-0 sudo[98421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:12 compute-0 podman[98461]: 2025-12-03 21:11:12.146527373 +0000 UTC m=+0.028518753 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:11:12 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec 03 21:11:12 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec 03 21:11:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e60 do_prune osdmap full prune enabled
Dec 03 21:11:12 compute-0 podman[98461]: 2025-12-03 21:11:12.383750331 +0000 UTC m=+0.265741721 container create 864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_cartwright, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:11:12 compute-0 ceph-mon[75204]: 2.1 scrub starts
Dec 03 21:11:12 compute-0 ceph-mon[75204]: 2.1 scrub ok
Dec 03 21:11:12 compute-0 ceph-mon[75204]: pgmap v136: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 11 B/s, 0 objects/s recovering
Dec 03 21:11:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"} : dispatch
Dec 03 21:11:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:11:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:11:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:11:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:11:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:11:12 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:11:12 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Dec 03 21:11:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e61 e61: 3 total, 3 up, 3 in
Dec 03 21:11:12 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e61: 3 total, 3 up, 3 in
Dec 03 21:11:12 compute-0 systemd[1]: Started libpod-conmon-864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0.scope.
Dec 03 21:11:12 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:11:12 compute-0 podman[98461]: 2025-12-03 21:11:12.497198663 +0000 UTC m=+0.379190023 container init 864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_cartwright, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:11:12 compute-0 podman[98461]: 2025-12-03 21:11:12.509120391 +0000 UTC m=+0.391111721 container start 864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_cartwright, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:11:12 compute-0 podman[98461]: 2025-12-03 21:11:12.51318999 +0000 UTC m=+0.395181330 container attach 864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_cartwright, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:11:12 compute-0 loving_cartwright[98477]: 167 167
Dec 03 21:11:12 compute-0 systemd[1]: libpod-864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0.scope: Deactivated successfully.
Dec 03 21:11:12 compute-0 podman[98461]: 2025-12-03 21:11:12.519237481 +0000 UTC m=+0.401228831 container died 864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_cartwright, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 03 21:11:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-e910c4735405106ee38c599d905321d080d5a43b724138f5860b7aed94bde7f7-merged.mount: Deactivated successfully.
Dec 03 21:11:12 compute-0 podman[98461]: 2025-12-03 21:11:12.569641888 +0000 UTC m=+0.451633238 container remove 864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_cartwright, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS)
Dec 03 21:11:12 compute-0 systemd[1]: libpod-conmon-864a593e9800f142def0403a4fc1c2c018cfc3b2c5ab944af812ada680a9b7f0.scope: Deactivated successfully.
Dec 03 21:11:12 compute-0 podman[98507]: 2025-12-03 21:11:12.760241811 +0000 UTC m=+0.057309892 container create 6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_wing, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:11:12 compute-0 systemd[1]: Started libpod-conmon-6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff.scope.
Dec 03 21:11:12 compute-0 podman[98507]: 2025-12-03 21:11:12.728206015 +0000 UTC m=+0.025274196 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:11:12 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:11:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/636d53a40c06d2edca71bdb9e21cd9e67576272fa20bb585f01d12f9017dd7db/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/636d53a40c06d2edca71bdb9e21cd9e67576272fa20bb585f01d12f9017dd7db/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/636d53a40c06d2edca71bdb9e21cd9e67576272fa20bb585f01d12f9017dd7db/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/636d53a40c06d2edca71bdb9e21cd9e67576272fa20bb585f01d12f9017dd7db/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/636d53a40c06d2edca71bdb9e21cd9e67576272fa20bb585f01d12f9017dd7db/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:12 compute-0 podman[98507]: 2025-12-03 21:11:12.871260318 +0000 UTC m=+0.168328469 container init 6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_wing, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:11:12 compute-0 podman[98507]: 2025-12-03 21:11:12.880500004 +0000 UTC m=+0.177568085 container start 6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_wing, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default)
Dec 03 21:11:12 compute-0 podman[98507]: 2025-12-03 21:11:12.896698717 +0000 UTC m=+0.193766838 container attach 6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_wing, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:11:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v138: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 11 B/s, 0 objects/s recovering
Dec 03 21:11:13 compute-0 ceph-mon[75204]: 3.b scrub starts
Dec 03 21:11:13 compute-0 ceph-mon[75204]: 3.b scrub ok
Dec 03 21:11:13 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Dec 03 21:11:13 compute-0 ceph-mon[75204]: osdmap e61: 3 total, 3 up, 3 in
Dec 03 21:11:13 compute-0 xenodochial_wing[98527]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:11:13 compute-0 xenodochial_wing[98527]: --> All data devices are unavailable
Dec 03 21:11:13 compute-0 systemd[1]: libpod-6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff.scope: Deactivated successfully.
Dec 03 21:11:13 compute-0 podman[98507]: 2025-12-03 21:11:13.470752686 +0000 UTC m=+0.767820777 container died 6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_wing, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:11:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-636d53a40c06d2edca71bdb9e21cd9e67576272fa20bb585f01d12f9017dd7db-merged.mount: Deactivated successfully.
Dec 03 21:11:13 compute-0 podman[98507]: 2025-12-03 21:11:13.516503608 +0000 UTC m=+0.813571689 container remove 6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_wing, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:11:13 compute-0 systemd[1]: libpod-conmon-6ac9b58abc007d1f0d03fa834422483f2327593202c65900aa4bd17c520587ff.scope: Deactivated successfully.
Dec 03 21:11:13 compute-0 sudo[98421]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:13 compute-0 sudo[98583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:11:13 compute-0 sudo[98583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:13 compute-0 sudo[98583]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:13 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116504669s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 active pruub 138.846450806s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:11:13 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116334915s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY pruub 138.846450806s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:11:13 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:11:13 compute-0 sudo[98609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:11:13 compute-0 sudo[98609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:14 compute-0 podman[98654]: 2025-12-03 21:11:14.027669147 +0000 UTC m=+0.055947406 container create 83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_agnesi, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:11:14 compute-0 systemd[1]: Started libpod-conmon-83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0.scope.
Dec 03 21:11:14 compute-0 podman[98654]: 2025-12-03 21:11:14.008670109 +0000 UTC m=+0.036948428 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:11:14 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:11:14 compute-0 podman[98654]: 2025-12-03 21:11:14.13032906 +0000 UTC m=+0.158607409 container init 83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_agnesi, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:11:14 compute-0 podman[98654]: 2025-12-03 21:11:14.13669974 +0000 UTC m=+0.164977999 container start 83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_agnesi, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:11:14 compute-0 podman[98654]: 2025-12-03 21:11:14.141050266 +0000 UTC m=+0.169328565 container attach 83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_agnesi, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:11:14 compute-0 suspicious_agnesi[98673]: 167 167
Dec 03 21:11:14 compute-0 systemd[1]: libpod-83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0.scope: Deactivated successfully.
Dec 03 21:11:14 compute-0 podman[98654]: 2025-12-03 21:11:14.144637932 +0000 UTC m=+0.172916241 container died 83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_agnesi, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:11:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-bc8b685e6854a8299dc9fee8f05cebe99944a3a923c3b67768a0ab0729a7d9f9-merged.mount: Deactivated successfully.
Dec 03 21:11:14 compute-0 podman[98654]: 2025-12-03 21:11:14.188914195 +0000 UTC m=+0.217192494 container remove 83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_agnesi, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:11:14 compute-0 systemd[1]: libpod-conmon-83a450290d6027a44f1a4d3a80081baae0538379c97464d51e645e8d74e542a0.scope: Deactivated successfully.
Dec 03 21:11:14 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Dec 03 21:11:14 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Dec 03 21:11:14 compute-0 podman[98698]: 2025-12-03 21:11:14.408193614 +0000 UTC m=+0.051021824 container create ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_perlman, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:11:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e61 do_prune osdmap full prune enabled
Dec 03 21:11:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 e62: 3 total, 3 up, 3 in
Dec 03 21:11:14 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e62: 3 total, 3 up, 3 in
Dec 03 21:11:14 compute-0 ceph-mon[75204]: pgmap v138: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 11 B/s, 0 objects/s recovering
Dec 03 21:11:14 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:11:14 compute-0 systemd[1]: Started libpod-conmon-ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2.scope.
Dec 03 21:11:14 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:11:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4ae90e71c93cf7f8cc69cf08c25ba5a5168b05326e7302e06d2d8613f83ce4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4ae90e71c93cf7f8cc69cf08c25ba5a5168b05326e7302e06d2d8613f83ce4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4ae90e71c93cf7f8cc69cf08c25ba5a5168b05326e7302e06d2d8613f83ce4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d4ae90e71c93cf7f8cc69cf08c25ba5a5168b05326e7302e06d2d8613f83ce4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:14 compute-0 podman[98698]: 2025-12-03 21:11:14.385105928 +0000 UTC m=+0.027934128 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:11:14 compute-0 podman[98698]: 2025-12-03 21:11:14.491286694 +0000 UTC m=+0.134114904 container init ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_perlman, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:11:14 compute-0 podman[98698]: 2025-12-03 21:11:14.511636038 +0000 UTC m=+0.154464218 container start ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_perlman, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:11:14 compute-0 podman[98698]: 2025-12-03 21:11:14.515159692 +0000 UTC m=+0.157987902 container attach ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_perlman, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]: {
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:     "0": [
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:         {
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "devices": [
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "/dev/loop3"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             ],
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_name": "ceph_lv0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_size": "21470642176",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "name": "ceph_lv0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "tags": {
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cluster_name": "ceph",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.crush_device_class": "",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.encrypted": "0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.objectstore": "bluestore",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osd_id": "0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.type": "block",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.vdo": "0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.with_tpm": "0"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             },
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "type": "block",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "vg_name": "ceph_vg0"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:         }
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:     ],
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:     "1": [
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:         {
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "devices": [
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "/dev/loop4"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             ],
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_name": "ceph_lv1",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_size": "21470642176",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "name": "ceph_lv1",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "tags": {
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cluster_name": "ceph",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.crush_device_class": "",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.encrypted": "0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.objectstore": "bluestore",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osd_id": "1",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.type": "block",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.vdo": "0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.with_tpm": "0"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             },
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "type": "block",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "vg_name": "ceph_vg1"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:         }
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:     ],
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:     "2": [
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:         {
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "devices": [
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "/dev/loop5"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             ],
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_name": "ceph_lv2",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_size": "21470642176",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "name": "ceph_lv2",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "tags": {
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.cluster_name": "ceph",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.crush_device_class": "",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.encrypted": "0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.objectstore": "bluestore",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osd_id": "2",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.type": "block",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.vdo": "0",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:                 "ceph.with_tpm": "0"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             },
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "type": "block",
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:             "vg_name": "ceph_vg2"
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:         }
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]:     ]
Dec 03 21:11:14 compute-0 optimistic_perlman[98715]: }
Dec 03 21:11:14 compute-0 systemd[1]: libpod-ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2.scope: Deactivated successfully.
Dec 03 21:11:14 compute-0 podman[98698]: 2025-12-03 21:11:14.818216143 +0000 UTC m=+0.461044383 container died ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_perlman, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:11:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-9d4ae90e71c93cf7f8cc69cf08c25ba5a5168b05326e7302e06d2d8613f83ce4-merged.mount: Deactivated successfully.
Dec 03 21:11:14 compute-0 podman[98698]: 2025-12-03 21:11:14.875774864 +0000 UTC m=+0.518603034 container remove ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_perlman, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:11:14 compute-0 systemd[1]: libpod-conmon-ce8432a3fe2c8ee40b6a48c6e6e63ef0b1e67f7b090b43795365506f11afa2b2.scope: Deactivated successfully.
Dec 03 21:11:14 compute-0 sudo[98609]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:14 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec 03 21:11:14 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec 03 21:11:14 compute-0 sudo[98744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:11:14 compute-0 sudo[98744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:15 compute-0 sudo[98744]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:15 compute-0 sudo[98769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:11:15 compute-0 sudo[98769]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v140: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 11 B/s, 0 objects/s recovering
Dec 03 21:11:16 compute-0 ceph-mon[75204]: 3.2 scrub starts
Dec 03 21:11:16 compute-0 ceph-mon[75204]: 3.2 scrub ok
Dec 03 21:11:16 compute-0 ceph-mon[75204]: osdmap e62: 3 total, 3 up, 3 in
Dec 03 21:11:16 compute-0 ceph-mon[75204]: 5.6 scrub starts
Dec 03 21:11:16 compute-0 ceph-mon[75204]: 5.6 scrub ok
Dec 03 21:11:16 compute-0 podman[98812]: 2025-12-03 21:11:16.085874909 +0000 UTC m=+0.038454804 container create c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=friendly_nobel, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:11:16 compute-0 systemd[1]: Started libpod-conmon-c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1.scope.
Dec 03 21:11:16 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:11:16 compute-0 podman[98812]: 2025-12-03 21:11:16.066921525 +0000 UTC m=+0.019501440 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:11:16 compute-0 podman[98812]: 2025-12-03 21:11:16.175054295 +0000 UTC m=+0.127634200 container init c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=friendly_nobel, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:11:16 compute-0 podman[98812]: 2025-12-03 21:11:16.183984572 +0000 UTC m=+0.136564497 container start c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=friendly_nobel, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:11:16 compute-0 podman[98812]: 2025-12-03 21:11:16.18812461 +0000 UTC m=+0.140704525 container attach c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=friendly_nobel, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:11:16 compute-0 friendly_nobel[98829]: 167 167
Dec 03 21:11:16 compute-0 systemd[1]: libpod-c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1.scope: Deactivated successfully.
Dec 03 21:11:16 compute-0 podman[98834]: 2025-12-03 21:11:16.241902832 +0000 UTC m=+0.034626254 container died c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=friendly_nobel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:11:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-37d486d2175e775d4438a46ffd08c8e4fcb0d52ca87650a8eea337c13bf48675-merged.mount: Deactivated successfully.
Dec 03 21:11:16 compute-0 podman[98834]: 2025-12-03 21:11:16.289772304 +0000 UTC m=+0.082495726 container remove c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=friendly_nobel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True)
Dec 03 21:11:16 compute-0 systemd[1]: libpod-conmon-c3d7c29ecabc23f59fdf4df495e5324a1f65d96f8311a57ef506a5b7e14b66b1.scope: Deactivated successfully.
Dec 03 21:11:16 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec 03 21:11:16 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec 03 21:11:16 compute-0 podman[98856]: 2025-12-03 21:11:16.480761211 +0000 UTC m=+0.038595388 container create 1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bohr, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 03 21:11:16 compute-0 systemd[1]: Started libpod-conmon-1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c.scope.
Dec 03 21:11:16 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:11:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ec9b31421c5409e2b16d8eff2753b50de52aa4aa55cdba301a5cfdd650e27be/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:16 compute-0 podman[98856]: 2025-12-03 21:11:16.465895184 +0000 UTC m=+0.023729381 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:11:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ec9b31421c5409e2b16d8eff2753b50de52aa4aa55cdba301a5cfdd650e27be/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ec9b31421c5409e2b16d8eff2753b50de52aa4aa55cdba301a5cfdd650e27be/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ec9b31421c5409e2b16d8eff2753b50de52aa4aa55cdba301a5cfdd650e27be/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:11:16 compute-0 podman[98856]: 2025-12-03 21:11:16.579472741 +0000 UTC m=+0.137306948 container init 1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bohr, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:11:16 compute-0 podman[98856]: 2025-12-03 21:11:16.589372054 +0000 UTC m=+0.147206231 container start 1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bohr, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:11:16 compute-0 podman[98856]: 2025-12-03 21:11:16.599625078 +0000 UTC m=+0.157459275 container attach 1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bohr, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:11:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:16 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec 03 21:11:16 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec 03 21:11:17 compute-0 ceph-mon[75204]: pgmap v140: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 11 B/s, 0 objects/s recovering
Dec 03 21:11:17 compute-0 ceph-mon[75204]: 3.15 scrub starts
Dec 03 21:11:17 compute-0 ceph-mon[75204]: 3.15 scrub ok
Dec 03 21:11:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v141: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 96 B/s, 0 objects/s recovering
Dec 03 21:11:17 compute-0 lvm[98953]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:11:17 compute-0 lvm[98953]: VG ceph_vg1 finished
Dec 03 21:11:17 compute-0 lvm[98954]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:11:17 compute-0 lvm[98954]: VG ceph_vg2 finished
Dec 03 21:11:17 compute-0 lvm[98950]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:11:17 compute-0 lvm[98950]: VG ceph_vg0 finished
Dec 03 21:11:17 compute-0 cool_bohr[98873]: {}
Dec 03 21:11:17 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec 03 21:11:17 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec 03 21:11:17 compute-0 systemd[1]: libpod-1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c.scope: Deactivated successfully.
Dec 03 21:11:17 compute-0 podman[98856]: 2025-12-03 21:11:17.425752785 +0000 UTC m=+0.983587002 container died 1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bohr, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:11:17 compute-0 systemd[1]: libpod-1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c.scope: Consumed 1.364s CPU time.
Dec 03 21:11:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-1ec9b31421c5409e2b16d8eff2753b50de52aa4aa55cdba301a5cfdd650e27be-merged.mount: Deactivated successfully.
Dec 03 21:11:17 compute-0 podman[98856]: 2025-12-03 21:11:17.477775026 +0000 UTC m=+1.035609223 container remove 1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bohr, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:11:17 compute-0 systemd[1]: libpod-conmon-1c2a95f45fcdb5b135d26477557c4bda3d5d988eb739135f28901398598e3c6c.scope: Deactivated successfully.
Dec 03 21:11:17 compute-0 sudo[98769]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:11:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:11:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:11:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:11:17 compute-0 sudo[98969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:11:17 compute-0 sudo[98969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:11:17 compute-0 sudo[98969]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:18 compute-0 ceph-mon[75204]: 5.e scrub starts
Dec 03 21:11:18 compute-0 ceph-mon[75204]: 5.e scrub ok
Dec 03 21:11:18 compute-0 ceph-mon[75204]: pgmap v141: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 96 B/s, 0 objects/s recovering
Dec 03 21:11:18 compute-0 ceph-mon[75204]: 3.17 scrub starts
Dec 03 21:11:18 compute-0 ceph-mon[75204]: 3.17 scrub ok
Dec 03 21:11:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:11:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:11:18 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec 03 21:11:18 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec 03 21:11:19 compute-0 ceph-mon[75204]: 7.1b scrub starts
Dec 03 21:11:19 compute-0 ceph-mon[75204]: 7.1b scrub ok
Dec 03 21:11:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v142: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 76 B/s, 0 objects/s recovering
Dec 03 21:11:19 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec 03 21:11:19 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec 03 21:11:20 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec 03 21:11:20 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec 03 21:11:20 compute-0 ceph-mon[75204]: pgmap v142: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 76 B/s, 0 objects/s recovering
Dec 03 21:11:20 compute-0 ceph-mon[75204]: 3.9 scrub starts
Dec 03 21:11:20 compute-0 ceph-mon[75204]: 3.9 scrub ok
Dec 03 21:11:20 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Dec 03 21:11:20 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Dec 03 21:11:20 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Dec 03 21:11:21 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Dec 03 21:11:21 compute-0 ceph-mon[75204]: 5.d scrub starts
Dec 03 21:11:21 compute-0 ceph-mon[75204]: 5.d scrub ok
Dec 03 21:11:21 compute-0 ceph-mon[75204]: 2.8 scrub starts
Dec 03 21:11:21 compute-0 ceph-mon[75204]: 2.8 scrub ok
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:11:21
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['vms', 'images', 'cephfs.cephfs.data', 'backups', 'volumes', '.mgr', 'cephfs.cephfs.meta']
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v143: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 69 B/s, 0 objects/s recovering
Dec 03 21:11:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:11:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:11:22 compute-0 ceph-mon[75204]: 5.1b scrub starts
Dec 03 21:11:22 compute-0 ceph-mon[75204]: 5.1b scrub ok
Dec 03 21:11:22 compute-0 ceph-mon[75204]: pgmap v143: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 69 B/s, 0 objects/s recovering
Dec 03 21:11:22 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec 03 21:11:22 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Dec 03 21:11:23 compute-0 ceph-mon[75204]: 3.6 scrub starts
Dec 03 21:11:23 compute-0 ceph-mon[75204]: 3.6 scrub ok
Dec 03 21:11:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v144: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 61 B/s, 0 objects/s recovering
Dec 03 21:11:24 compute-0 ceph-mon[75204]: pgmap v144: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 61 B/s, 0 objects/s recovering
Dec 03 21:11:24 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Dec 03 21:11:24 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Dec 03 21:11:25 compute-0 ceph-mon[75204]: 3.0 scrub starts
Dec 03 21:11:25 compute-0 ceph-mon[75204]: 3.0 scrub ok
Dec 03 21:11:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v145: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 57 B/s, 0 objects/s recovering
Dec 03 21:11:26 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec 03 21:11:26 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec 03 21:11:26 compute-0 ceph-mon[75204]: pgmap v145: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 57 B/s, 0 objects/s recovering
Dec 03 21:11:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:27 compute-0 ceph-mon[75204]: 4.1a scrub starts
Dec 03 21:11:27 compute-0 ceph-mon[75204]: 4.1a scrub ok
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v146: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 51 B/s, 0 objects/s recovering
Dec 03 21:11:27 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Dec 03 21:11:27 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:11:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:11:28 compute-0 ceph-mon[75204]: pgmap v146: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 51 B/s, 0 objects/s recovering
Dec 03 21:11:28 compute-0 ceph-mon[75204]: 7.3 scrub starts
Dec 03 21:11:28 compute-0 ceph-mon[75204]: 7.3 scrub ok
Dec 03 21:11:28 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Dec 03 21:11:28 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Dec 03 21:11:29 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec 03 21:11:29 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec 03 21:11:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v147: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:29 compute-0 ceph-mon[75204]: 7.0 scrub starts
Dec 03 21:11:29 compute-0 ceph-mon[75204]: 7.0 scrub ok
Dec 03 21:11:30 compute-0 ceph-mon[75204]: 4.1b scrub starts
Dec 03 21:11:30 compute-0 ceph-mon[75204]: 4.1b scrub ok
Dec 03 21:11:30 compute-0 ceph-mon[75204]: pgmap v147: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:30 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec 03 21:11:30 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec 03 21:11:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v148: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:31 compute-0 ceph-mon[75204]: 2.1f scrub starts
Dec 03 21:11:31 compute-0 ceph-mon[75204]: 2.1f scrub ok
Dec 03 21:11:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:32 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec 03 21:11:32 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec 03 21:11:32 compute-0 ceph-mon[75204]: pgmap v148: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:32 compute-0 ceph-mon[75204]: 4.a scrub starts
Dec 03 21:11:32 compute-0 ceph-mon[75204]: 4.a scrub ok
Dec 03 21:11:32 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Dec 03 21:11:32 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Dec 03 21:11:33 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec 03 21:11:33 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec 03 21:11:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v149: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:33 compute-0 ceph-mon[75204]: 3.3 scrub starts
Dec 03 21:11:33 compute-0 ceph-mon[75204]: 3.3 scrub ok
Dec 03 21:11:33 compute-0 ceph-mon[75204]: 4.18 scrub starts
Dec 03 21:11:33 compute-0 ceph-mon[75204]: 4.18 scrub ok
Dec 03 21:11:34 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec 03 21:11:34 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec 03 21:11:34 compute-0 ceph-mon[75204]: pgmap v149: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:34 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec 03 21:11:34 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec 03 21:11:35 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 03 21:11:35 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 03 21:11:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v150: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:35 compute-0 ceph-mon[75204]: 3.4 scrub starts
Dec 03 21:11:35 compute-0 ceph-mon[75204]: 3.4 scrub ok
Dec 03 21:11:35 compute-0 ceph-mon[75204]: 7.13 scrub starts
Dec 03 21:11:35 compute-0 ceph-mon[75204]: 7.13 scrub ok
Dec 03 21:11:35 compute-0 ceph-mon[75204]: 4.11 scrub starts
Dec 03 21:11:35 compute-0 ceph-mon[75204]: 4.11 scrub ok
Dec 03 21:11:35 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Dec 03 21:11:35 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Dec 03 21:11:36 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Dec 03 21:11:36 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Dec 03 21:11:36 compute-0 ceph-mon[75204]: pgmap v150: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:36 compute-0 ceph-mon[75204]: 2.16 scrub starts
Dec 03 21:11:36 compute-0 ceph-mon[75204]: 2.16 scrub ok
Dec 03 21:11:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v151: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:37 compute-0 ceph-mon[75204]: 7.7 scrub starts
Dec 03 21:11:37 compute-0 ceph-mon[75204]: 7.7 scrub ok
Dec 03 21:11:38 compute-0 ceph-mon[75204]: pgmap v151: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v152: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:40 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec 03 21:11:40 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec 03 21:11:40 compute-0 ceph-mon[75204]: pgmap v152: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:40 compute-0 ceph-mon[75204]: 4.e scrub starts
Dec 03 21:11:40 compute-0 ceph-mon[75204]: 4.e scrub ok
Dec 03 21:11:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v153: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:41 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 03 21:11:41 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 03 21:11:41 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec 03 21:11:41 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec 03 21:11:41 compute-0 ceph-mon[75204]: 3.18 scrub starts
Dec 03 21:11:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:42 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec 03 21:11:42 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec 03 21:11:42 compute-0 ceph-mon[75204]: pgmap v153: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:42 compute-0 ceph-mon[75204]: 3.18 scrub ok
Dec 03 21:11:42 compute-0 ceph-mon[75204]: 7.f scrub starts
Dec 03 21:11:42 compute-0 ceph-mon[75204]: 7.f scrub ok
Dec 03 21:11:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v154: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:43 compute-0 ceph-mon[75204]: 7.1c scrub starts
Dec 03 21:11:43 compute-0 ceph-mon[75204]: 7.1c scrub ok
Dec 03 21:11:44 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec 03 21:11:44 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec 03 21:11:44 compute-0 ceph-mon[75204]: pgmap v154: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:44 compute-0 ceph-mon[75204]: 3.16 scrub starts
Dec 03 21:11:44 compute-0 ceph-mon[75204]: 3.16 scrub ok
Dec 03 21:11:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v155: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:45 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.d scrub starts
Dec 03 21:11:45 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.d scrub ok
Dec 03 21:11:46 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 03 21:11:46 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 03 21:11:46 compute-0 ceph-mon[75204]: pgmap v155: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:46 compute-0 ceph-mon[75204]: 7.d scrub starts
Dec 03 21:11:46 compute-0 ceph-mon[75204]: 7.d scrub ok
Dec 03 21:11:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v156: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:47 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec 03 21:11:47 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec 03 21:11:47 compute-0 ceph-mon[75204]: 2.f scrub starts
Dec 03 21:11:47 compute-0 ceph-mon[75204]: 2.f scrub ok
Dec 03 21:11:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Dec 03 21:11:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Dec 03 21:11:48 compute-0 ceph-mon[75204]: pgmap v156: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:48 compute-0 ceph-mon[75204]: 7.11 scrub starts
Dec 03 21:11:48 compute-0 ceph-mon[75204]: 7.11 scrub ok
Dec 03 21:11:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v157: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:49 compute-0 ceph-mon[75204]: 3.1c scrub starts
Dec 03 21:11:49 compute-0 ceph-mon[75204]: 3.1c scrub ok
Dec 03 21:11:50 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 03 21:11:50 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 03 21:11:50 compute-0 ceph-mon[75204]: pgmap v157: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v158: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:51 compute-0 ceph-mon[75204]: 5.5 scrub starts
Dec 03 21:11:51 compute-0 ceph-mon[75204]: 5.5 scrub ok
Dec 03 21:11:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:11:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:11:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:11:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:11:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:11:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:11:52 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 03 21:11:52 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Dec 03 21:11:52 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 03 21:11:52 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Dec 03 21:11:52 compute-0 ceph-mon[75204]: pgmap v158: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v159: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:53 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec 03 21:11:53 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec 03 21:11:53 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Dec 03 21:11:53 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Dec 03 21:11:53 compute-0 ceph-mon[75204]: 7.6 scrub starts
Dec 03 21:11:53 compute-0 ceph-mon[75204]: 7.19 scrub starts
Dec 03 21:11:53 compute-0 ceph-mon[75204]: 7.6 scrub ok
Dec 03 21:11:53 compute-0 ceph-mon[75204]: 7.19 scrub ok
Dec 03 21:11:54 compute-0 ceph-mon[75204]: pgmap v159: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:54 compute-0 ceph-mon[75204]: 4.d scrub starts
Dec 03 21:11:54 compute-0 ceph-mon[75204]: 4.d scrub ok
Dec 03 21:11:54 compute-0 ceph-mon[75204]: 4.1 scrub starts
Dec 03 21:11:54 compute-0 ceph-mon[75204]: 4.1 scrub ok
Dec 03 21:11:54 compute-0 sudo[98355]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v160: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:55 compute-0 sudo[99222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcanlrbnzrpnmunyqrulqpjmqzrfnbby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796314.9327745-137-77960004176785/AnsiballZ_command.py'
Dec 03 21:11:55 compute-0 sudo[99222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:11:55 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 03 21:11:55 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 03 21:11:55 compute-0 python3.9[99224]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:11:56 compute-0 sudo[99222]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:56 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec 03 21:11:56 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec 03 21:11:56 compute-0 ceph-mon[75204]: pgmap v160: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:56 compute-0 ceph-mon[75204]: 7.a scrub starts
Dec 03 21:11:56 compute-0 ceph-mon[75204]: 7.a scrub ok
Dec 03 21:11:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:11:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v161: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:57 compute-0 sudo[99509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeakbmvtbnqcxihveggnltnzploohmrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796316.5322852-145-167353068272869/AnsiballZ_selinux.py'
Dec 03 21:11:57 compute-0 sudo[99509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:11:57 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec 03 21:11:57 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec 03 21:11:57 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Dec 03 21:11:57 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Dec 03 21:11:57 compute-0 ceph-mon[75204]: 3.11 scrub starts
Dec 03 21:11:57 compute-0 ceph-mon[75204]: 3.11 scrub ok
Dec 03 21:11:57 compute-0 python3.9[99511]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 03 21:11:57 compute-0 sudo[99509]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:58 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Dec 03 21:11:58 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Dec 03 21:11:58 compute-0 sudo[99661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwvdoyhtehaojtqzwmllwzjqcgegpgqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796317.9259088-156-179659459675532/AnsiballZ_command.py'
Dec 03 21:11:58 compute-0 sudo[99661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:11:58 compute-0 ceph-mon[75204]: pgmap v161: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:58 compute-0 ceph-mon[75204]: 7.15 scrub starts
Dec 03 21:11:58 compute-0 ceph-mon[75204]: 7.15 scrub ok
Dec 03 21:11:58 compute-0 ceph-mon[75204]: 4.2 scrub starts
Dec 03 21:11:58 compute-0 ceph-mon[75204]: 4.2 scrub ok
Dec 03 21:11:58 compute-0 python3.9[99663]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 03 21:11:58 compute-0 sudo[99661]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:59 compute-0 sudo[99813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjdmixuwcllytbfpberzbcpvypibkfqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796318.7437966-164-155236089457823/AnsiballZ_file.py'
Dec 03 21:11:59 compute-0 sudo[99813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:11:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v162: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:11:59 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 03 21:11:59 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 03 21:11:59 compute-0 python3.9[99815]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:11:59 compute-0 sudo[99813]: pam_unix(sudo:session): session closed for user root
Dec 03 21:11:59 compute-0 ceph-mon[75204]: 2.2 scrub starts
Dec 03 21:11:59 compute-0 ceph-mon[75204]: 2.2 scrub ok
Dec 03 21:12:00 compute-0 sudo[99965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sirtbwstysfuzttptcpdbeotcqtpqkfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796319.589257-172-51114967479253/AnsiballZ_mount.py'
Dec 03 21:12:00 compute-0 sudo[99965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:00 compute-0 python3.9[99967]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 03 21:12:00 compute-0 sudo[99965]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:00 compute-0 ceph-mon[75204]: pgmap v162: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:00 compute-0 ceph-mon[75204]: 3.e scrub starts
Dec 03 21:12:00 compute-0 ceph-mon[75204]: 3.e scrub ok
Dec 03 21:12:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v163: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:01 compute-0 sudo[100117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztvzfpirzkxqapakuacfqsjbjnytvpmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796321.277521-200-168402356270914/AnsiballZ_file.py'
Dec 03 21:12:01 compute-0 sudo[100117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:01 compute-0 python3.9[100119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:12:01 compute-0 sudo[100117]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:02 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 03 21:12:02 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 03 21:12:02 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 03 21:12:02 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 03 21:12:02 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 03 21:12:02 compute-0 ceph-mon[75204]: pgmap v163: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:02 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 03 21:12:02 compute-0 sudo[100269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntmdivxrzboamvinpzljdviguoghqccb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796322.1581333-208-48308355946134/AnsiballZ_stat.py'
Dec 03 21:12:02 compute-0 sudo[100269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:02 compute-0 python3.9[100271]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:12:02 compute-0 sudo[100269]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:03 compute-0 sudo[100347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smyeodasofocqujoagpjviozbjcwgxug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796322.1581333-208-48308355946134/AnsiballZ_file.py'
Dec 03 21:12:03 compute-0 sudo[100347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v164: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:03 compute-0 python3.9[100349]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:12:03 compute-0 sudo[100347]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:03 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec 03 21:12:03 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec 03 21:12:03 compute-0 ceph-mon[75204]: 5.3 scrub starts
Dec 03 21:12:03 compute-0 ceph-mon[75204]: 5.3 scrub ok
Dec 03 21:12:03 compute-0 ceph-mon[75204]: 7.8 scrub starts
Dec 03 21:12:03 compute-0 ceph-mon[75204]: 7.8 scrub ok
Dec 03 21:12:03 compute-0 ceph-mon[75204]: 4.14 scrub starts
Dec 03 21:12:03 compute-0 ceph-mon[75204]: 4.14 scrub ok
Dec 03 21:12:04 compute-0 sudo[100499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thslvsepbwvfzgnspsjekibjwbanvxnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796323.8552287-229-279250400546202/AnsiballZ_stat.py'
Dec 03 21:12:04 compute-0 sudo[100499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:04 compute-0 python3.9[100501]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:12:04 compute-0 sudo[100499]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:04 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 03 21:12:04 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 03 21:12:04 compute-0 ceph-mon[75204]: pgmap v164: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:04 compute-0 ceph-mon[75204]: 4.12 scrub starts
Dec 03 21:12:04 compute-0 ceph-mon[75204]: 4.12 scrub ok
Dec 03 21:12:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v165: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:05 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Dec 03 21:12:05 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Dec 03 21:12:05 compute-0 sudo[100653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwshwfkcbawyriyidrtqxxddfojcpwqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796324.8986132-242-181057496256548/AnsiballZ_getent.py'
Dec 03 21:12:05 compute-0 sudo[100653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:05 compute-0 ceph-mon[75204]: 7.5 scrub starts
Dec 03 21:12:05 compute-0 ceph-mon[75204]: 7.5 scrub ok
Dec 03 21:12:05 compute-0 python3.9[100655]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 03 21:12:05 compute-0 sudo[100653]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:06 compute-0 sudo[100806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnkqkxglznantrneabyuidetujtcgrot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796325.9708252-252-219943847447680/AnsiballZ_getent.py'
Dec 03 21:12:06 compute-0 sudo[100806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:06 compute-0 ceph-mon[75204]: pgmap v165: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:06 compute-0 ceph-mon[75204]: 7.2 scrub starts
Dec 03 21:12:06 compute-0 ceph-mon[75204]: 7.2 scrub ok
Dec 03 21:12:06 compute-0 python3.9[100808]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 03 21:12:06 compute-0 sudo[100806]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v166: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:07 compute-0 sudo[100959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnvemuygbriesilithobptxuedttqvrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796326.8110464-260-116976274107515/AnsiballZ_group.py'
Dec 03 21:12:07 compute-0 sudo[100959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:07 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec 03 21:12:07 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec 03 21:12:07 compute-0 python3.9[100961]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 03 21:12:07 compute-0 sudo[100959]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:07 compute-0 ceph-mon[75204]: pgmap v166: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:08 compute-0 sudo[101111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhhddhddxmnpvkewbhaebartbmimdrvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796327.7357812-269-12485070125590/AnsiballZ_file.py'
Dec 03 21:12:08 compute-0 sudo[101111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:08 compute-0 python3.9[101113]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 03 21:12:08 compute-0 sudo[101111]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:08 compute-0 ceph-mon[75204]: 7.1 scrub starts
Dec 03 21:12:08 compute-0 ceph-mon[75204]: 7.1 scrub ok
Dec 03 21:12:09 compute-0 sudo[101263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgsftmvidoyehxzqotkamuukltlqujve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796328.6678336-280-261880304356072/AnsiballZ_dnf.py'
Dec 03 21:12:09 compute-0 sudo[101263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v167: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:09 compute-0 python3.9[101265]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:12:09 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 03 21:12:09 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 03 21:12:09 compute-0 ceph-mon[75204]: pgmap v167: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:10 compute-0 sudo[101263]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:10 compute-0 ceph-mon[75204]: 7.9 scrub starts
Dec 03 21:12:10 compute-0 ceph-mon[75204]: 7.9 scrub ok
Dec 03 21:12:10 compute-0 sudo[101416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihkvoellpuponcnlyylsfrjesvucwoeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796330.6739435-288-171651427035931/AnsiballZ_file.py'
Dec 03 21:12:10 compute-0 sudo[101416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:11 compute-0 python3.9[101418]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:12:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v168: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:11 compute-0 sudo[101416]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:11 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Dec 03 21:12:11 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Dec 03 21:12:11 compute-0 ceph-mon[75204]: pgmap v168: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:11 compute-0 ceph-mon[75204]: 2.1b scrub starts
Dec 03 21:12:11 compute-0 ceph-mon[75204]: 2.1b scrub ok
Dec 03 21:12:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:11 compute-0 sudo[101569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfvwwwuundrhvyekzimkyaisvzsvqbyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796331.4086957-296-167079891003141/AnsiballZ_stat.py'
Dec 03 21:12:11 compute-0 sudo[101569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:11 compute-0 python3.9[101571]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:12:12 compute-0 sudo[101569]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:12 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 03 21:12:12 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 03 21:12:12 compute-0 sudo[101647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wercninhfjtbjwtxyoflxstpucthhywm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796331.4086957-296-167079891003141/AnsiballZ_file.py'
Dec 03 21:12:12 compute-0 sudo[101647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:12 compute-0 python3.9[101649]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:12:12 compute-0 sudo[101647]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:13 compute-0 ceph-mon[75204]: 5.4 scrub starts
Dec 03 21:12:13 compute-0 ceph-mon[75204]: 5.4 scrub ok
Dec 03 21:12:13 compute-0 sudo[101799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxnwpyolnsuafdjmipopuzekiizguayt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796332.812629-309-192837443277206/AnsiballZ_stat.py'
Dec 03 21:12:13 compute-0 sudo[101799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v169: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:13 compute-0 python3.9[101801]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:12:13 compute-0 sudo[101799]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:13 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec 03 21:12:13 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec 03 21:12:13 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec 03 21:12:13 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec 03 21:12:13 compute-0 sudo[101877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xasyicgmzckkwpdqbgcnjaismoxohkwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796332.812629-309-192837443277206/AnsiballZ_file.py'
Dec 03 21:12:13 compute-0 sudo[101877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:13 compute-0 python3.9[101879]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:12:13 compute-0 sudo[101877]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:14 compute-0 ceph-mon[75204]: pgmap v169: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:14 compute-0 ceph-mon[75204]: 5.11 scrub starts
Dec 03 21:12:14 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec 03 21:12:14 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec 03 21:12:15 compute-0 sudo[102029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifyxhsesmkmpbekhfzaishqiiioeypwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796334.301068-324-9625493173240/AnsiballZ_dnf.py'
Dec 03 21:12:15 compute-0 sudo[102029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:15 compute-0 ceph-mon[75204]: 3.7 scrub starts
Dec 03 21:12:15 compute-0 ceph-mon[75204]: 3.7 scrub ok
Dec 03 21:12:15 compute-0 ceph-mon[75204]: 5.11 scrub ok
Dec 03 21:12:15 compute-0 ceph-mon[75204]: 7.18 scrub starts
Dec 03 21:12:15 compute-0 ceph-mon[75204]: 7.18 scrub ok
Dec 03 21:12:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v170: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:15 compute-0 python3.9[102031]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:12:15 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 03 21:12:15 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 03 21:12:16 compute-0 ceph-mon[75204]: pgmap v170: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:16 compute-0 ceph-mon[75204]: 4.4 scrub starts
Dec 03 21:12:16 compute-0 ceph-mon[75204]: 4.4 scrub ok
Dec 03 21:12:16 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.c scrub starts
Dec 03 21:12:16 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.c scrub ok
Dec 03 21:12:16 compute-0 sudo[102029]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:16 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 03 21:12:16 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 03 21:12:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:17 compute-0 ceph-mon[75204]: 4.10 scrub starts
Dec 03 21:12:17 compute-0 ceph-mon[75204]: 4.10 scrub ok
Dec 03 21:12:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v171: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:17 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.c scrub starts
Dec 03 21:12:17 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.c scrub ok
Dec 03 21:12:17 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 03 21:12:17 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 03 21:12:17 compute-0 python3.9[102182]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:12:17 compute-0 sudo[102232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:12:17 compute-0 sudo[102232]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:17 compute-0 sudo[102232]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:17 compute-0 sudo[102286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:12:17 compute-0 sudo[102286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:18 compute-0 ceph-mon[75204]: 7.c scrub starts
Dec 03 21:12:18 compute-0 ceph-mon[75204]: 7.c scrub ok
Dec 03 21:12:18 compute-0 ceph-mon[75204]: pgmap v171: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:18 compute-0 ceph-mon[75204]: 3.c scrub starts
Dec 03 21:12:18 compute-0 ceph-mon[75204]: 3.c scrub ok
Dec 03 21:12:18 compute-0 python3.9[102398]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 03 21:12:18 compute-0 sudo[102286]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:12:18 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:12:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:12:18 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:12:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:12:18 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:12:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:12:18 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:12:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:12:18 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:12:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:12:18 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:12:18 compute-0 sudo[102439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:12:18 compute-0 sudo[102439]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:18 compute-0 sudo[102439]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:18 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec 03 21:12:18 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec 03 21:12:18 compute-0 sudo[102482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:12:18 compute-0 sudo[102482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:18 compute-0 podman[102606]: 2025-12-03 21:12:18.805201772 +0000 UTC m=+0.049743817 container create c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_hypatia, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:12:18 compute-0 systemd[1]: Started libpod-conmon-c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860.scope.
Dec 03 21:12:18 compute-0 podman[102606]: 2025-12-03 21:12:18.782861851 +0000 UTC m=+0.027403886 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:12:18 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:12:18 compute-0 podman[102606]: 2025-12-03 21:12:18.903703355 +0000 UTC m=+0.148245380 container init c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_hypatia, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:12:18 compute-0 podman[102606]: 2025-12-03 21:12:18.910234943 +0000 UTC m=+0.154776948 container start c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_hypatia, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:12:18 compute-0 podman[102606]: 2025-12-03 21:12:18.913311982 +0000 UTC m=+0.157853987 container attach c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_hypatia, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:12:18 compute-0 nervous_hypatia[102642]: 167 167
Dec 03 21:12:18 compute-0 systemd[1]: libpod-c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860.scope: Deactivated successfully.
Dec 03 21:12:18 compute-0 conmon[102642]: conmon c259561a95386af7e842 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860.scope/container/memory.events
Dec 03 21:12:18 compute-0 podman[102606]: 2025-12-03 21:12:18.916793981 +0000 UTC m=+0.161335996 container died c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_hypatia, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:12:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-41856caa5538d07d43fc7635daa27bbcefcdd62f140e3bdc2e61377c39bd2c13-merged.mount: Deactivated successfully.
Dec 03 21:12:18 compute-0 podman[102606]: 2025-12-03 21:12:18.953894725 +0000 UTC m=+0.198436740 container remove c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_hypatia, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:12:18 compute-0 systemd[1]: libpod-conmon-c259561a95386af7e842b396475bac578ce8f8fb1e9db0cd3547a043bcbab860.scope: Deactivated successfully.
Dec 03 21:12:19 compute-0 python3.9[102639]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:12:19 compute-0 podman[102687]: 2025-12-03 21:12:19.14311949 +0000 UTC m=+0.049030527 container create 9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=intelligent_edison, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:12:19 compute-0 ceph-mon[75204]: 3.1d scrub starts
Dec 03 21:12:19 compute-0 ceph-mon[75204]: 3.1d scrub ok
Dec 03 21:12:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:12:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:12:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:12:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:12:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:12:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:12:19 compute-0 systemd[1]: Started libpod-conmon-9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4.scope.
Dec 03 21:12:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v172: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:19 compute-0 podman[102687]: 2025-12-03 21:12:19.120968825 +0000 UTC m=+0.026879902 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:12:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e8dcb95bbbca97bcf11db1615a5464486cb099702bc1b27c36931006bd52186/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e8dcb95bbbca97bcf11db1615a5464486cb099702bc1b27c36931006bd52186/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e8dcb95bbbca97bcf11db1615a5464486cb099702bc1b27c36931006bd52186/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e8dcb95bbbca97bcf11db1615a5464486cb099702bc1b27c36931006bd52186/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e8dcb95bbbca97bcf11db1615a5464486cb099702bc1b27c36931006bd52186/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:19 compute-0 podman[102687]: 2025-12-03 21:12:19.231003011 +0000 UTC m=+0.136914078 container init 9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=intelligent_edison, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:12:19 compute-0 podman[102687]: 2025-12-03 21:12:19.240990487 +0000 UTC m=+0.146901524 container start 9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=intelligent_edison, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 03 21:12:19 compute-0 podman[102687]: 2025-12-03 21:12:19.244614583 +0000 UTC m=+0.150525620 container attach 9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=intelligent_edison, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:12:19 compute-0 intelligent_edison[102706]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:12:19 compute-0 intelligent_edison[102706]: --> All data devices are unavailable
Dec 03 21:12:19 compute-0 systemd[1]: libpod-9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4.scope: Deactivated successfully.
Dec 03 21:12:19 compute-0 podman[102687]: 2025-12-03 21:12:19.772009709 +0000 UTC m=+0.677920746 container died 9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=intelligent_edison, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:12:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-8e8dcb95bbbca97bcf11db1615a5464486cb099702bc1b27c36931006bd52186-merged.mount: Deactivated successfully.
Dec 03 21:12:19 compute-0 podman[102687]: 2025-12-03 21:12:19.815543286 +0000 UTC m=+0.721454313 container remove 9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=intelligent_edison, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:12:19 compute-0 systemd[1]: libpod-conmon-9319acf5684c010917579d21408162f1ca5763723937156aa69c93abb894ebf4.scope: Deactivated successfully.
Dec 03 21:12:19 compute-0 sudo[102482]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:19 compute-0 sudo[102838]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:12:19 compute-0 sudo[102838]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:19 compute-0 sudo[102838]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:19 compute-0 sudo[102887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qypmjusqbowlsycarsntvnuyelhnegmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796339.362093-365-34376104848856/AnsiballZ_systemd.py'
Dec 03 21:12:19 compute-0 sudo[102887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:20 compute-0 sudo[102892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:12:20 compute-0 sudo[102892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:20 compute-0 ceph-mon[75204]: 7.1a scrub starts
Dec 03 21:12:20 compute-0 ceph-mon[75204]: 7.1a scrub ok
Dec 03 21:12:20 compute-0 ceph-mon[75204]: pgmap v172: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:20 compute-0 python3.9[102891]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:12:20 compute-0 podman[102928]: 2025-12-03 21:12:20.329912426 +0000 UTC m=+0.058743741 container create b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rosalind, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:12:20 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 03 21:12:20 compute-0 systemd[1]: Started libpod-conmon-b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819.scope.
Dec 03 21:12:20 compute-0 podman[102928]: 2025-12-03 21:12:20.305180239 +0000 UTC m=+0.034011544 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:12:20 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:12:20 compute-0 podman[102928]: 2025-12-03 21:12:20.425259639 +0000 UTC m=+0.154090974 container init b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rosalind, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 03 21:12:20 compute-0 podman[102928]: 2025-12-03 21:12:20.433354663 +0000 UTC m=+0.162185958 container start b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rosalind, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:12:20 compute-0 podman[102928]: 2025-12-03 21:12:20.437464512 +0000 UTC m=+0.166295807 container attach b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rosalind, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:12:20 compute-0 cool_rosalind[102950]: 167 167
Dec 03 21:12:20 compute-0 systemd[1]: libpod-b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819.scope: Deactivated successfully.
Dec 03 21:12:20 compute-0 podman[102928]: 2025-12-03 21:12:20.43924645 +0000 UTC m=+0.168077745 container died b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rosalind, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:12:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-c8971bb5d4e2a5298f36d2341aebe64cfcc8ec9bab2bf2a7dadd8d2a08b2dce1-merged.mount: Deactivated successfully.
Dec 03 21:12:20 compute-0 podman[102928]: 2025-12-03 21:12:20.497169198 +0000 UTC m=+0.226000463 container remove b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rosalind, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:12:20 compute-0 systemd[1]: libpod-conmon-b2f8ad000549e2bec096e9048ad59cbf327f04ba411d710f3a0c56b6b8698819.scope: Deactivated successfully.
Dec 03 21:12:20 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 03 21:12:20 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 03 21:12:20 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 03 21:12:20 compute-0 podman[102979]: 2025-12-03 21:12:20.718197698 +0000 UTC m=+0.055922136 container create 11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_mclean, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:12:20 compute-0 systemd[1]: Started libpod-conmon-11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a.scope.
Dec 03 21:12:20 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 03 21:12:20 compute-0 podman[102979]: 2025-12-03 21:12:20.686198318 +0000 UTC m=+0.023922776 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:12:20 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edf480ccc4f5da821f4c1935c83ef5369bf21fcae9396657b2d68031f7dcca1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edf480ccc4f5da821f4c1935c83ef5369bf21fcae9396657b2d68031f7dcca1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edf480ccc4f5da821f4c1935c83ef5369bf21fcae9396657b2d68031f7dcca1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8edf480ccc4f5da821f4c1935c83ef5369bf21fcae9396657b2d68031f7dcca1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:20 compute-0 sudo[102887]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:20 compute-0 podman[102979]: 2025-12-03 21:12:20.829118954 +0000 UTC m=+0.166843472 container init 11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_mclean, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:12:20 compute-0 podman[102979]: 2025-12-03 21:12:20.845324305 +0000 UTC m=+0.183048763 container start 11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_mclean, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:12:20 compute-0 podman[102979]: 2025-12-03 21:12:20.930018554 +0000 UTC m=+0.267743052 container attach 11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_mclean, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]: {
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:     "0": [
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:         {
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "devices": [
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "/dev/loop3"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             ],
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_name": "ceph_lv0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_size": "21470642176",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "name": "ceph_lv0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "tags": {
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cluster_name": "ceph",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.crush_device_class": "",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.encrypted": "0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.objectstore": "bluestore",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osd_id": "0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.type": "block",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.vdo": "0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.with_tpm": "0"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             },
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "type": "block",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "vg_name": "ceph_vg0"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:         }
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:     ],
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:     "1": [
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:         {
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "devices": [
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "/dev/loop4"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             ],
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_name": "ceph_lv1",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_size": "21470642176",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "name": "ceph_lv1",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "tags": {
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cluster_name": "ceph",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.crush_device_class": "",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.encrypted": "0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.objectstore": "bluestore",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osd_id": "1",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.type": "block",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.vdo": "0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.with_tpm": "0"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             },
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "type": "block",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "vg_name": "ceph_vg1"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:         }
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:     ],
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:     "2": [
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:         {
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "devices": [
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "/dev/loop5"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             ],
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_name": "ceph_lv2",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_size": "21470642176",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "name": "ceph_lv2",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "tags": {
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.cluster_name": "ceph",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.crush_device_class": "",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.encrypted": "0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.objectstore": "bluestore",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osd_id": "2",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.type": "block",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.vdo": "0",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:                 "ceph.with_tpm": "0"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             },
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "type": "block",
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:             "vg_name": "ceph_vg2"
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:         }
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]:     ]
Dec 03 21:12:21 compute-0 nostalgic_mclean[102999]: }
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:12:21
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['images', 'backups', 'volumes', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'vms', '.mgr']
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:12:21 compute-0 systemd[1]: libpod-11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a.scope: Deactivated successfully.
Dec 03 21:12:21 compute-0 podman[102979]: 2025-12-03 21:12:21.201964646 +0000 UTC m=+0.539689104 container died 11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_mclean, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v173: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-8edf480ccc4f5da821f4c1935c83ef5369bf21fcae9396657b2d68031f7dcca1-merged.mount: Deactivated successfully.
Dec 03 21:12:21 compute-0 podman[102979]: 2025-12-03 21:12:21.250089284 +0000 UTC m=+0.587813702 container remove 11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_mclean, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:12:21 compute-0 systemd[1]: libpod-conmon-11ba17ed3cb74bae427fc989755162f00f4b2a9dfcded4fffdbcf05b19e01f5a.scope: Deactivated successfully.
Dec 03 21:12:21 compute-0 sudo[102892]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:21 compute-0 sudo[103144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:12:21 compute-0 sudo[103144]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:21 compute-0 sudo[103144]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:21 compute-0 sudo[103196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:12:21 compute-0 sudo[103196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:21 compute-0 python3.9[103193]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:12:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:12:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:12:21 compute-0 podman[103258]: 2025-12-03 21:12:21.825033323 +0000 UTC m=+0.056965224 container create 8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_chatelet, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:12:21 compute-0 systemd[1]: Started libpod-conmon-8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e.scope.
Dec 03 21:12:21 compute-0 podman[103258]: 2025-12-03 21:12:21.797858552 +0000 UTC m=+0.029790473 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:12:21 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:12:21 compute-0 podman[103258]: 2025-12-03 21:12:21.931543781 +0000 UTC m=+0.163475742 container init 8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:12:21 compute-0 podman[103258]: 2025-12-03 21:12:21.944149697 +0000 UTC m=+0.176081608 container start 8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_chatelet, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:12:21 compute-0 podman[103258]: 2025-12-03 21:12:21.947727161 +0000 UTC m=+0.179659042 container attach 8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_chatelet, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 03 21:12:21 compute-0 modest_chatelet[103274]: 167 167
Dec 03 21:12:21 compute-0 systemd[1]: libpod-8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e.scope: Deactivated successfully.
Dec 03 21:12:21 compute-0 podman[103258]: 2025-12-03 21:12:21.952382975 +0000 UTC m=+0.184314916 container died 8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_chatelet, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:12:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-8361d89ff0e0a4ecca4ea17bc885ebb22322aef83cc9a0931e2da2205335a6ea-merged.mount: Deactivated successfully.
Dec 03 21:12:21 compute-0 podman[103258]: 2025-12-03 21:12:21.999792594 +0000 UTC m=+0.231724475 container remove 8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_chatelet, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:12:22 compute-0 systemd[1]: libpod-conmon-8a654df33291ecdfca283720f51f17875dbba47778e55380a93399cecf3eee1e.scope: Deactivated successfully.
Dec 03 21:12:22 compute-0 podman[103299]: 2025-12-03 21:12:22.217127636 +0000 UTC m=+0.050505223 container create 2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_satoshi, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:12:22 compute-0 systemd[1]: Started libpod-conmon-2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd.scope.
Dec 03 21:12:22 compute-0 ceph-mon[75204]: pgmap v173: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:22 compute-0 podman[103299]: 2025-12-03 21:12:22.197213817 +0000 UTC m=+0.030591394 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:12:22 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aba2d6f6a6d39b710f7f32ccf3d28f1e84c02cf77e605d36bdc1f7e6edd6d33/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aba2d6f6a6d39b710f7f32ccf3d28f1e84c02cf77e605d36bdc1f7e6edd6d33/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aba2d6f6a6d39b710f7f32ccf3d28f1e84c02cf77e605d36bdc1f7e6edd6d33/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7aba2d6f6a6d39b710f7f32ccf3d28f1e84c02cf77e605d36bdc1f7e6edd6d33/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:12:22 compute-0 podman[103299]: 2025-12-03 21:12:22.32422003 +0000 UTC m=+0.157597637 container init 2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_satoshi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:12:22 compute-0 podman[103299]: 2025-12-03 21:12:22.339493446 +0000 UTC m=+0.172871003 container start 2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_satoshi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:12:22 compute-0 podman[103299]: 2025-12-03 21:12:22.343844551 +0000 UTC m=+0.177222118 container attach 2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_satoshi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:12:22 compute-0 lvm[103394]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:12:22 compute-0 lvm[103394]: VG ceph_vg1 finished
Dec 03 21:12:22 compute-0 lvm[103393]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:12:22 compute-0 lvm[103393]: VG ceph_vg0 finished
Dec 03 21:12:23 compute-0 lvm[103396]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:12:23 compute-0 lvm[103396]: VG ceph_vg2 finished
Dec 03 21:12:23 compute-0 dazzling_satoshi[103315]: {}
Dec 03 21:12:23 compute-0 systemd[1]: libpod-2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd.scope: Deactivated successfully.
Dec 03 21:12:23 compute-0 podman[103299]: 2025-12-03 21:12:23.156908285 +0000 UTC m=+0.990285872 container died 2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_satoshi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 03 21:12:23 compute-0 systemd[1]: libpod-2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd.scope: Consumed 1.315s CPU time.
Dec 03 21:12:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-7aba2d6f6a6d39b710f7f32ccf3d28f1e84c02cf77e605d36bdc1f7e6edd6d33-merged.mount: Deactivated successfully.
Dec 03 21:12:23 compute-0 podman[103299]: 2025-12-03 21:12:23.20871215 +0000 UTC m=+1.042089707 container remove 2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_satoshi, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:12:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v174: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:23 compute-0 systemd[1]: libpod-conmon-2818224afdf5dc2624219eefeb862ade009c4694febb48ad69987dd10ba038bd.scope: Deactivated successfully.
Dec 03 21:12:23 compute-0 sudo[103196]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:12:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:12:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:12:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:12:23 compute-0 sudo[103410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:12:23 compute-0 sudo[103410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:12:23 compute-0 sudo[103410]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:23 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Dec 03 21:12:23 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Dec 03 21:12:23 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec 03 21:12:23 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec 03 21:12:23 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec 03 21:12:23 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec 03 21:12:23 compute-0 sudo[103560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syywczrtnltdfzihlizjvxbanghluwta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796343.5158653-422-114810950780715/AnsiballZ_systemd.py'
Dec 03 21:12:23 compute-0 sudo[103560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:24 compute-0 python3.9[103562]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:12:24 compute-0 sudo[103560]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:24 compute-0 ceph-mon[75204]: pgmap v174: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:12:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:12:24 compute-0 ceph-mon[75204]: 5.7 scrub starts
Dec 03 21:12:24 compute-0 ceph-mon[75204]: 5.7 scrub ok
Dec 03 21:12:24 compute-0 ceph-mon[75204]: 4.13 scrub starts
Dec 03 21:12:24 compute-0 ceph-mon[75204]: 4.13 scrub ok
Dec 03 21:12:24 compute-0 ceph-mon[75204]: 2.17 scrub starts
Dec 03 21:12:24 compute-0 ceph-mon[75204]: 2.17 scrub ok
Dec 03 21:12:24 compute-0 sudo[103714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnuoxrxebdbwvpztrramfzeezlvpmfjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796344.3572352-422-26851441568129/AnsiballZ_systemd.py'
Dec 03 21:12:24 compute-0 sudo[103714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:25 compute-0 python3.9[103716]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:12:25 compute-0 sudo[103714]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v175: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:25 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec 03 21:12:25 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec 03 21:12:25 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec 03 21:12:25 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec 03 21:12:25 compute-0 sshd-session[96393]: Connection closed by 192.168.122.30 port 59376
Dec 03 21:12:25 compute-0 sshd-session[96390]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:12:25 compute-0 systemd-logind[787]: Session 34 logged out. Waiting for processes to exit.
Dec 03 21:12:25 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Dec 03 21:12:25 compute-0 systemd[1]: session-34.scope: Consumed 1min 7.887s CPU time.
Dec 03 21:12:25 compute-0 systemd-logind[787]: Removed session 34.
Dec 03 21:12:26 compute-0 ceph-mon[75204]: pgmap v175: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:26 compute-0 ceph-mon[75204]: 7.4 scrub starts
Dec 03 21:12:26 compute-0 ceph-mon[75204]: 7.4 scrub ok
Dec 03 21:12:26 compute-0 ceph-mon[75204]: 3.5 scrub starts
Dec 03 21:12:26 compute-0 ceph-mon[75204]: 3.5 scrub ok
Dec 03 21:12:26 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 03 21:12:26 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 03 21:12:26 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 03 21:12:26 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 03 21:12:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v176: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:27 compute-0 ceph-mon[75204]: 7.e scrub starts
Dec 03 21:12:27 compute-0 ceph-mon[75204]: 7.e scrub ok
Dec 03 21:12:27 compute-0 ceph-mon[75204]: 4.f scrub starts
Dec 03 21:12:27 compute-0 ceph-mon[75204]: 4.f scrub ok
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:12:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:12:28 compute-0 ceph-mon[75204]: pgmap v176: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v177: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:29 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Dec 03 21:12:29 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Dec 03 21:12:29 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec 03 21:12:29 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec 03 21:12:29 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 03 21:12:29 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 03 21:12:30 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec 03 21:12:30 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec 03 21:12:30 compute-0 ceph-mon[75204]: pgmap v177: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:30 compute-0 ceph-mon[75204]: 5.2 scrub starts
Dec 03 21:12:30 compute-0 ceph-mon[75204]: 5.2 scrub ok
Dec 03 21:12:30 compute-0 ceph-mon[75204]: 6.8 scrub starts
Dec 03 21:12:30 compute-0 ceph-mon[75204]: 6.8 scrub ok
Dec 03 21:12:30 compute-0 ceph-mon[75204]: 2.15 scrub starts
Dec 03 21:12:30 compute-0 ceph-mon[75204]: 2.15 scrub ok
Dec 03 21:12:30 compute-0 sshd-session[103743]: Accepted publickey for zuul from 192.168.122.30 port 38166 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:12:30 compute-0 systemd-logind[787]: New session 35 of user zuul.
Dec 03 21:12:30 compute-0 systemd[1]: Started Session 35 of User zuul.
Dec 03 21:12:30 compute-0 sshd-session[103743]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:12:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v178: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:31 compute-0 ceph-mon[75204]: 7.1f scrub starts
Dec 03 21:12:31 compute-0 ceph-mon[75204]: 7.1f scrub ok
Dec 03 21:12:31 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec 03 21:12:31 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec 03 21:12:31 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 03 21:12:31 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 03 21:12:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:31 compute-0 python3.9[103896]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:12:32 compute-0 ceph-mon[75204]: pgmap v178: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:32 compute-0 ceph-mon[75204]: 3.f scrub starts
Dec 03 21:12:32 compute-0 ceph-mon[75204]: 3.f scrub ok
Dec 03 21:12:32 compute-0 ceph-mon[75204]: 6.f scrub starts
Dec 03 21:12:32 compute-0 ceph-mon[75204]: 6.f scrub ok
Dec 03 21:12:32 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 03 21:12:32 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 03 21:12:32 compute-0 sudo[104050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxcmfzurznhxtexixsqeshhvtrxgevkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796352.4691296-36-275493549986321/AnsiballZ_getent.py'
Dec 03 21:12:32 compute-0 sudo[104050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:33 compute-0 python3.9[104052]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 03 21:12:33 compute-0 sudo[104050]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v179: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:33 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec 03 21:12:33 compute-0 ceph-mon[75204]: 5.13 scrub starts
Dec 03 21:12:33 compute-0 ceph-mon[75204]: 5.13 scrub ok
Dec 03 21:12:33 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec 03 21:12:33 compute-0 sudo[104203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhgtjxuwfmbqyagtcehjtlganefohysz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796353.576423-48-98845107259089/AnsiballZ_setup.py'
Dec 03 21:12:33 compute-0 sudo[104203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:34 compute-0 python3.9[104205]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:12:34 compute-0 ceph-mon[75204]: pgmap v179: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:34 compute-0 ceph-mon[75204]: 3.1 scrub starts
Dec 03 21:12:34 compute-0 ceph-mon[75204]: 3.1 scrub ok
Dec 03 21:12:34 compute-0 sudo[104203]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:35 compute-0 sudo[104287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgromafuwnjqosaxiovbxbghwqnwtsqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796353.576423-48-98845107259089/AnsiballZ_dnf.py'
Dec 03 21:12:35 compute-0 sudo[104287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v180: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:35 compute-0 python3.9[104289]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 03 21:12:35 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Dec 03 21:12:35 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Dec 03 21:12:36 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 03 21:12:36 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec 03 21:12:36 compute-0 ceph-mon[75204]: pgmap v180: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:36 compute-0 ceph-mon[75204]: 5.12 scrub starts
Dec 03 21:12:36 compute-0 ceph-mon[75204]: 5.12 scrub ok
Dec 03 21:12:36 compute-0 sudo[104287]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:37 compute-0 sudo[104440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqryzfakctznwjknaxvkdvranizxboai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796356.8025024-62-177536728987923/AnsiballZ_dnf.py'
Dec 03 21:12:37 compute-0 sudo[104440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v181: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:37 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 03 21:12:37 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 03 21:12:37 compute-0 ceph-mon[75204]: 2.18 scrub starts
Dec 03 21:12:37 compute-0 ceph-mon[75204]: 2.18 scrub ok
Dec 03 21:12:37 compute-0 python3.9[104442]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:12:38 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 03 21:12:38 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 03 21:12:38 compute-0 ceph-mon[75204]: pgmap v181: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:38 compute-0 ceph-mon[75204]: 5.1e scrub starts
Dec 03 21:12:38 compute-0 ceph-mon[75204]: 5.1e scrub ok
Dec 03 21:12:38 compute-0 sudo[104440]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v182: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:39 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Dec 03 21:12:39 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Dec 03 21:12:39 compute-0 ceph-mon[75204]: 2.19 scrub starts
Dec 03 21:12:39 compute-0 ceph-mon[75204]: 2.19 scrub ok
Dec 03 21:12:39 compute-0 sudo[104593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofjzycergiuxlbudtuopjdyrpbnjkxgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796358.9274614-70-38831313912129/AnsiballZ_systemd.py'
Dec 03 21:12:39 compute-0 sudo[104593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:39 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec 03 21:12:39 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec 03 21:12:39 compute-0 python3.9[104595]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:12:40 compute-0 sudo[104593]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:40 compute-0 ceph-mon[75204]: pgmap v182: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:40 compute-0 ceph-mon[75204]: 6.3 scrub starts
Dec 03 21:12:40 compute-0 ceph-mon[75204]: 6.3 scrub ok
Dec 03 21:12:40 compute-0 ceph-mon[75204]: 5.9 scrub starts
Dec 03 21:12:40 compute-0 ceph-mon[75204]: 5.9 scrub ok
Dec 03 21:12:40 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec 03 21:12:40 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec 03 21:12:40 compute-0 python3.9[104748]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:12:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v183: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:41 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Dec 03 21:12:41 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Dec 03 21:12:41 compute-0 ceph-mon[75204]: 5.16 scrub starts
Dec 03 21:12:41 compute-0 ceph-mon[75204]: 5.16 scrub ok
Dec 03 21:12:41 compute-0 sudo[104898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajtnpnnlayyklhkcblfikyswqtihgdyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796361.132065-88-12973893301522/AnsiballZ_sefcontext.py'
Dec 03 21:12:41 compute-0 sudo[104898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:41 compute-0 python3.9[104900]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 03 21:12:42 compute-0 sudo[104898]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:42 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Dec 03 21:12:42 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Dec 03 21:12:42 compute-0 ceph-mon[75204]: pgmap v183: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:42 compute-0 ceph-mon[75204]: 6.0 scrub starts
Dec 03 21:12:42 compute-0 ceph-mon[75204]: 6.0 scrub ok
Dec 03 21:12:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v184: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:43 compute-0 python3.9[105050]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:12:43 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec 03 21:12:43 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec 03 21:12:43 compute-0 ceph-mon[75204]: 6.7 scrub starts
Dec 03 21:12:43 compute-0 ceph-mon[75204]: 6.7 scrub ok
Dec 03 21:12:44 compute-0 sudo[105206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pibrwdjiwcqgchrvhteqieyblmiiqpos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796363.7812543-106-180164843738734/AnsiballZ_dnf.py'
Dec 03 21:12:44 compute-0 sudo[105206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:44 compute-0 python3.9[105208]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:12:44 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 03 21:12:44 compute-0 ceph-mon[75204]: pgmap v184: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:44 compute-0 ceph-mon[75204]: 2.d scrub starts
Dec 03 21:12:44 compute-0 ceph-mon[75204]: 2.d scrub ok
Dec 03 21:12:44 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 03 21:12:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v185: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:45 compute-0 ceph-mon[75204]: 6.9 scrub starts
Dec 03 21:12:45 compute-0 ceph-mon[75204]: 6.9 scrub ok
Dec 03 21:12:45 compute-0 sudo[105206]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:46 compute-0 sudo[105359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fspkcpdsejhkfwlmzrkctavziwoypvfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796365.8817422-114-31451180061801/AnsiballZ_command.py'
Dec 03 21:12:46 compute-0 sudo[105359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:46 compute-0 ceph-mon[75204]: pgmap v185: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:46 compute-0 python3.9[105361]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:12:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v186: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:47 compute-0 sudo[105359]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:47 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec 03 21:12:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Dec 03 21:12:47 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec 03 21:12:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Dec 03 21:12:47 compute-0 sudo[105646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynxnmkaqpnhgylhpqoxxcoexdhymjqgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796367.5094173-122-56946276926817/AnsiballZ_file.py'
Dec 03 21:12:47 compute-0 sudo[105646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:48 compute-0 python3.9[105648]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 03 21:12:48 compute-0 sudo[105646]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:48 compute-0 ceph-mon[75204]: pgmap v186: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:48 compute-0 ceph-mon[75204]: 4.9 scrub starts
Dec 03 21:12:48 compute-0 ceph-mon[75204]: 6.5 scrub starts
Dec 03 21:12:48 compute-0 ceph-mon[75204]: 4.9 scrub ok
Dec 03 21:12:48 compute-0 ceph-mon[75204]: 6.5 scrub ok
Dec 03 21:12:48 compute-0 python3.9[105798]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:12:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v187: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:49 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.a scrub starts
Dec 03 21:12:49 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.a scrub ok
Dec 03 21:12:49 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Dec 03 21:12:49 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Dec 03 21:12:49 compute-0 sudo[105950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxhmglbiodbtcwlwgrcmdfvzigdgpert ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796369.1754572-138-207470604975761/AnsiballZ_dnf.py'
Dec 03 21:12:49 compute-0 sudo[105950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:49 compute-0 python3.9[105952]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:12:50 compute-0 ceph-mon[75204]: pgmap v187: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:50 compute-0 ceph-mon[75204]: 6.a scrub starts
Dec 03 21:12:50 compute-0 ceph-mon[75204]: 6.a scrub ok
Dec 03 21:12:50 compute-0 ceph-mon[75204]: 2.3 scrub starts
Dec 03 21:12:50 compute-0 ceph-mon[75204]: 2.3 scrub ok
Dec 03 21:12:50 compute-0 sudo[105950]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v188: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:51 compute-0 sudo[106103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbchcejxtzejrqvkhhiufothotvoviwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796371.2033484-147-2207635560301/AnsiballZ_dnf.py'
Dec 03 21:12:51 compute-0 sudo[106103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:51 compute-0 ceph-mon[75204]: pgmap v188: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:12:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:12:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:12:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:12:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:12:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:12:51 compute-0 python3.9[106105]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:12:52 compute-0 sudo[106103]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v189: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:53 compute-0 sudo[106256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfrwzqtvkphgdthxfoslgeqvqryjrvua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796373.410543-159-186812078869978/AnsiballZ_stat.py'
Dec 03 21:12:53 compute-0 sudo[106256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:53 compute-0 python3.9[106258]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:12:53 compute-0 sudo[106256]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:54 compute-0 ceph-mon[75204]: pgmap v189: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:54 compute-0 sudo[106410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhqqkbyotcdoummnbdgobseojyffertw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796374.1305141-167-148777215991316/AnsiballZ_slurp.py'
Dec 03 21:12:54 compute-0 sudo[106410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:12:54 compute-0 python3.9[106412]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec 03 21:12:54 compute-0 sudo[106410]: pam_unix(sudo:session): session closed for user root
Dec 03 21:12:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v190: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:55 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec 03 21:12:55 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec 03 21:12:55 compute-0 sshd-session[103746]: Connection closed by 192.168.122.30 port 38166
Dec 03 21:12:55 compute-0 sshd-session[103743]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:12:55 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Dec 03 21:12:55 compute-0 systemd[1]: session-35.scope: Consumed 19.223s CPU time.
Dec 03 21:12:55 compute-0 systemd-logind[787]: Session 35 logged out. Waiting for processes to exit.
Dec 03 21:12:55 compute-0 systemd-logind[787]: Removed session 35.
Dec 03 21:12:56 compute-0 ceph-mon[75204]: pgmap v190: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:56 compute-0 ceph-mon[75204]: 4.5 scrub starts
Dec 03 21:12:56 compute-0 ceph-mon[75204]: 4.5 scrub ok
Dec 03 21:12:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:12:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v191: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:58 compute-0 ceph-mon[75204]: pgmap v191: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v192: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:12:59 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec 03 21:12:59 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec 03 21:13:00 compute-0 ceph-mon[75204]: pgmap v192: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:00 compute-0 ceph-mon[75204]: 2.a scrub starts
Dec 03 21:13:00 compute-0 ceph-mon[75204]: 2.a scrub ok
Dec 03 21:13:01 compute-0 sshd-session[106438]: Accepted publickey for zuul from 192.168.122.30 port 35064 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:13:01 compute-0 systemd-logind[787]: New session 36 of user zuul.
Dec 03 21:13:01 compute-0 systemd[1]: Started Session 36 of User zuul.
Dec 03 21:13:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v193: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:01 compute-0 sshd-session[106438]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:13:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:02 compute-0 ceph-mon[75204]: pgmap v193: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:02 compute-0 python3.9[106591]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:13:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v194: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:03 compute-0 python3.9[106745]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:13:04 compute-0 ceph-mon[75204]: pgmap v194: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:04 compute-0 python3.9[106939]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:13:04 compute-0 sshd-session[106441]: Connection closed by 192.168.122.30 port 35064
Dec 03 21:13:04 compute-0 sshd-session[106438]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:13:04 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Dec 03 21:13:04 compute-0 systemd[1]: session-36.scope: Consumed 2.600s CPU time.
Dec 03 21:13:04 compute-0 systemd-logind[787]: Session 36 logged out. Waiting for processes to exit.
Dec 03 21:13:05 compute-0 systemd-logind[787]: Removed session 36.
Dec 03 21:13:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v195: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:06 compute-0 ceph-mon[75204]: pgmap v195: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v196: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:08 compute-0 ceph-mon[75204]: pgmap v196: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v197: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:10 compute-0 sshd-session[106966]: Accepted publickey for zuul from 192.168.122.30 port 36038 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:13:10 compute-0 systemd-logind[787]: New session 37 of user zuul.
Dec 03 21:13:10 compute-0 systemd[1]: Started Session 37 of User zuul.
Dec 03 21:13:10 compute-0 sshd-session[106966]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:13:10 compute-0 ceph-mon[75204]: pgmap v197: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v198: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:11 compute-0 python3.9[107119]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:13:11 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 03 21:13:11 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 03 21:13:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:12 compute-0 ceph-mon[75204]: pgmap v198: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:12 compute-0 ceph-mon[75204]: 4.7 scrub starts
Dec 03 21:13:12 compute-0 ceph-mon[75204]: 4.7 scrub ok
Dec 03 21:13:12 compute-0 python3.9[107273]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:13:13 compute-0 sudo[107427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvtbhmxchcmldahytgbajzafepsxogen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796392.7846932-40-102379461678031/AnsiballZ_setup.py'
Dec 03 21:13:13 compute-0 sudo[107427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v199: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:13 compute-0 python3.9[107429]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:13:13 compute-0 sudo[107427]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:14 compute-0 sudo[107511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sapbozgbggpftbtjoevrvezhrbqhwxgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796392.7846932-40-102379461678031/AnsiballZ_dnf.py'
Dec 03 21:13:14 compute-0 sudo[107511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:14 compute-0 python3.9[107513]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:13:14 compute-0 ceph-mon[75204]: pgmap v199: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v200: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:15 compute-0 sudo[107511]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:16 compute-0 sudo[107665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iikakjkpbudgkqaphshsekstzwcvdbhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796395.738872-52-254148714416037/AnsiballZ_setup.py'
Dec 03 21:13:16 compute-0 sudo[107665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:16 compute-0 python3.9[107667]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:13:16 compute-0 ceph-mon[75204]: pgmap v200: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:16 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Dec 03 21:13:16 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Dec 03 21:13:16 compute-0 sudo[107665]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v201: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:17 compute-0 sudo[107860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzoukpsibdqmjacdjexqzvspkdbzsxvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796396.8595507-63-201185568734482/AnsiballZ_file.py'
Dec 03 21:13:17 compute-0 sudo[107860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:17 compute-0 ceph-mon[75204]: 2.5 scrub starts
Dec 03 21:13:17 compute-0 ceph-mon[75204]: 2.5 scrub ok
Dec 03 21:13:17 compute-0 python3.9[107862]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:13:17 compute-0 sudo[107860]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:18 compute-0 sudo[108012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttvaaqrsxxactiwqnoydixddqjhimozq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796397.772176-71-232004807043521/AnsiballZ_command.py'
Dec 03 21:13:18 compute-0 sudo[108012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:18 compute-0 python3.9[108014]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:13:18 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 03 21:13:18 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 03 21:13:18 compute-0 ceph-mon[75204]: pgmap v201: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:18 compute-0 sudo[108012]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:19 compute-0 sudo[108177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdyowbwhexonrfxwrxaejqywewwqrmke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796398.668589-79-180682588463358/AnsiballZ_stat.py'
Dec 03 21:13:19 compute-0 sudo[108177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v202: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:19 compute-0 python3.9[108179]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:13:19 compute-0 sudo[108177]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:19 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec 03 21:13:19 compute-0 ceph-mon[75204]: 2.4 scrub starts
Dec 03 21:13:19 compute-0 ceph-mon[75204]: 2.4 scrub ok
Dec 03 21:13:19 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec 03 21:13:19 compute-0 sudo[108255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llxosphrgsuxefiihhbvmqlnhrmavghd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796398.668589-79-180682588463358/AnsiballZ_file.py'
Dec 03 21:13:19 compute-0 sudo[108255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:19 compute-0 python3.9[108257]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:13:19 compute-0 sudo[108255]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:20 compute-0 ceph-mon[75204]: pgmap v202: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:20 compute-0 ceph-mon[75204]: 2.7 scrub starts
Dec 03 21:13:20 compute-0 ceph-mon[75204]: 2.7 scrub ok
Dec 03 21:13:20 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Dec 03 21:13:20 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Dec 03 21:13:20 compute-0 sudo[108407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fobbuhtppswzwnauvkzhevlbhlinugdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796400.2134938-91-193098496647831/AnsiballZ_stat.py'
Dec 03 21:13:20 compute-0 sudo[108407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:20 compute-0 python3.9[108409]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:13:20 compute-0 sudo[108407]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:21 compute-0 sudo[108485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qikefghuifpegvnzpszoprkgyuoygrlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796400.2134938-91-193098496647831/AnsiballZ_file.py'
Dec 03 21:13:21 compute-0 sudo[108485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:13:21
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['.mgr', 'cephfs.cephfs.data', 'cephfs.cephfs.meta', 'images', 'backups', 'vms', 'volumes']
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v203: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:21 compute-0 python3.9[108487]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:13:21 compute-0 sudo[108485]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:21 compute-0 ceph-mon[75204]: 5.1 scrub starts
Dec 03 21:13:21 compute-0 ceph-mon[75204]: 5.1 scrub ok
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:13:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:13:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:13:21 compute-0 sudo[108637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzmnjmwhcqprzkswotldbtdrmdtncdva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796401.5116653-104-81122579034138/AnsiballZ_ini_file.py'
Dec 03 21:13:21 compute-0 sudo[108637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:22 compute-0 python3.9[108639]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:13:22 compute-0 sudo[108637]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:22 compute-0 ceph-mon[75204]: pgmap v203: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:22 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 03 21:13:22 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 03 21:13:22 compute-0 sudo[108789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnropoljxlfrgpjxfktlfkzlvebxarcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796402.3406608-104-275189765128565/AnsiballZ_ini_file.py'
Dec 03 21:13:22 compute-0 sudo[108789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:22 compute-0 python3.9[108791]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:13:22 compute-0 sudo[108789]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v204: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:23 compute-0 sudo[108962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrbpreegnupymownyuqxmoassjjzakwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796403.073517-104-52068722110730/AnsiballZ_ini_file.py'
Dec 03 21:13:23 compute-0 sudo[108962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:23 compute-0 sudo[108922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:13:23 compute-0 sudo[108922]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:23 compute-0 sudo[108922]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:23 compute-0 ceph-mon[75204]: 2.6 scrub starts
Dec 03 21:13:23 compute-0 ceph-mon[75204]: 2.6 scrub ok
Dec 03 21:13:23 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.f scrub starts
Dec 03 21:13:23 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.f scrub ok
Dec 03 21:13:23 compute-0 sudo[108969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:13:23 compute-0 sudo[108969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:23 compute-0 python3.9[108966]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:13:23 compute-0 sudo[108962]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:24 compute-0 sudo[108969]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:24 compute-0 sudo[109172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtexoitftezadtmiecuupgouxdmgdkcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796403.810771-104-135065131456296/AnsiballZ_ini_file.py'
Dec 03 21:13:24 compute-0 sudo[109172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:13:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:13:24 compute-0 sudo[109175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:13:24 compute-0 sudo[109175]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:24 compute-0 sudo[109175]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:24 compute-0 python3.9[109174]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:13:24 compute-0 sudo[109200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:13:24 compute-0 sudo[109172]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:24 compute-0 sudo[109200]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:24 compute-0 ceph-mon[75204]: pgmap v204: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:24 compute-0 ceph-mon[75204]: 5.f scrub starts
Dec 03 21:13:24 compute-0 ceph-mon[75204]: 5.f scrub ok
Dec 03 21:13:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:13:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:13:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:13:24 compute-0 podman[109284]: 2025-12-03 21:13:24.67463092 +0000 UTC m=+0.058946248 container create ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_turing, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default)
Dec 03 21:13:24 compute-0 systemd[1]: Started libpod-conmon-ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120.scope.
Dec 03 21:13:24 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:13:24 compute-0 podman[109284]: 2025-12-03 21:13:24.653105905 +0000 UTC m=+0.037421283 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:13:24 compute-0 podman[109284]: 2025-12-03 21:13:24.763197321 +0000 UTC m=+0.147512679 container init ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_turing, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:13:24 compute-0 podman[109284]: 2025-12-03 21:13:24.775159701 +0000 UTC m=+0.159475069 container start ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_turing, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:13:24 compute-0 podman[109284]: 2025-12-03 21:13:24.779272631 +0000 UTC m=+0.163587969 container attach ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_turing, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:13:24 compute-0 thirsty_turing[109336]: 167 167
Dec 03 21:13:24 compute-0 systemd[1]: libpod-ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120.scope: Deactivated successfully.
Dec 03 21:13:24 compute-0 podman[109284]: 2025-12-03 21:13:24.784793049 +0000 UTC m=+0.169108377 container died ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_turing, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:13:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-11f0a95f9dc99f3006ec7aa399e5a074bfd8629007f6baac81b30f94a4a35d18-merged.mount: Deactivated successfully.
Dec 03 21:13:24 compute-0 podman[109284]: 2025-12-03 21:13:24.826706821 +0000 UTC m=+0.211022159 container remove ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_turing, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:13:24 compute-0 systemd[1]: libpod-conmon-ececa866e2916f47eb9900c4039e100c54861795fe9e0acd7176bc14d3f6e120.scope: Deactivated successfully.
Dec 03 21:13:24 compute-0 sudo[109419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdjhdlvsjhzjvccspjsjoczzeystersv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796404.5931468-135-129350141409715/AnsiballZ_dnf.py'
Dec 03 21:13:24 compute-0 sudo[109419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:25 compute-0 podman[109427]: 2025-12-03 21:13:25.006939824 +0000 UTC m=+0.062443762 container create 7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_shamir, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:13:25 compute-0 systemd[1]: Started libpod-conmon-7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536.scope.
Dec 03 21:13:25 compute-0 podman[109427]: 2025-12-03 21:13:24.978077652 +0000 UTC m=+0.033581640 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:13:25 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a533eb7e03fc03bf36a511408a541ea6331fc595641ebb8c6923c884ff46b6d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a533eb7e03fc03bf36a511408a541ea6331fc595641ebb8c6923c884ff46b6d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a533eb7e03fc03bf36a511408a541ea6331fc595641ebb8c6923c884ff46b6d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a533eb7e03fc03bf36a511408a541ea6331fc595641ebb8c6923c884ff46b6d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a533eb7e03fc03bf36a511408a541ea6331fc595641ebb8c6923c884ff46b6d/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:25 compute-0 podman[109427]: 2025-12-03 21:13:25.116286341 +0000 UTC m=+0.171790279 container init 7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_shamir, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:13:25 compute-0 podman[109427]: 2025-12-03 21:13:25.124637734 +0000 UTC m=+0.180141672 container start 7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_shamir, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:13:25 compute-0 podman[109427]: 2025-12-03 21:13:25.129137025 +0000 UTC m=+0.184640993 container attach 7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_shamir, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:13:25 compute-0 python3.9[109426]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:13:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v205: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:25 compute-0 gallant_shamir[109443]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:13:25 compute-0 gallant_shamir[109443]: --> All data devices are unavailable
Dec 03 21:13:25 compute-0 systemd[1]: libpod-7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536.scope: Deactivated successfully.
Dec 03 21:13:25 compute-0 podman[109427]: 2025-12-03 21:13:25.733827079 +0000 UTC m=+0.789330987 container died 7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_shamir, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:13:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-6a533eb7e03fc03bf36a511408a541ea6331fc595641ebb8c6923c884ff46b6d-merged.mount: Deactivated successfully.
Dec 03 21:13:25 compute-0 podman[109427]: 2025-12-03 21:13:25.781364431 +0000 UTC m=+0.836868339 container remove 7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_shamir, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:13:25 compute-0 sshd-session[106437]: Connection reset by 118.196.51.24 port 59776 [preauth]
Dec 03 21:13:25 compute-0 systemd[1]: libpod-conmon-7a1e81dae286abc6245901e9ea14b804343050f540f4cb492d2c0b1b666d8536.scope: Deactivated successfully.
Dec 03 21:13:25 compute-0 sudo[109200]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:25 compute-0 sudo[109476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:13:25 compute-0 sudo[109476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:25 compute-0 sudo[109476]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:25 compute-0 sudo[109501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:13:25 compute-0 sudo[109501]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:26 compute-0 podman[109537]: 2025-12-03 21:13:26.282730268 +0000 UTC m=+0.044751727 container create de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_curie, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:13:26 compute-0 systemd[1]: Started libpod-conmon-de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d.scope.
Dec 03 21:13:26 compute-0 podman[109537]: 2025-12-03 21:13:26.263378741 +0000 UTC m=+0.025400240 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:13:26 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:13:26 compute-0 podman[109537]: 2025-12-03 21:13:26.398328393 +0000 UTC m=+0.160349942 container init de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_curie, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:13:26 compute-0 podman[109537]: 2025-12-03 21:13:26.410137419 +0000 UTC m=+0.172158908 container start de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_curie, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:13:26 compute-0 sudo[109419]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:26 compute-0 podman[109537]: 2025-12-03 21:13:26.414759653 +0000 UTC m=+0.176781212 container attach de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_curie, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:13:26 compute-0 infallible_curie[109554]: 167 167
Dec 03 21:13:26 compute-0 systemd[1]: libpod-de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d.scope: Deactivated successfully.
Dec 03 21:13:26 compute-0 podman[109537]: 2025-12-03 21:13:26.416222111 +0000 UTC m=+0.178243570 container died de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_curie, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle)
Dec 03 21:13:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-c75db3a862c3d6f25e17a0afe423d032c173e74ee674fcbcc543b275f2f73135-merged.mount: Deactivated successfully.
Dec 03 21:13:26 compute-0 podman[109537]: 2025-12-03 21:13:26.465707696 +0000 UTC m=+0.227729145 container remove de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_curie, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:13:26 compute-0 systemd[1]: libpod-conmon-de90cd022f8dc9e8a714fdd96c6abcb2940e035c4ae8653a967d1b29fce0a72d.scope: Deactivated successfully.
Dec 03 21:13:26 compute-0 ceph-mon[75204]: pgmap v205: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:26 compute-0 podman[109600]: 2025-12-03 21:13:26.630665411 +0000 UTC m=+0.054011077 container create 817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_leavitt, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:13:26 compute-0 systemd[1]: Started libpod-conmon-817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4.scope.
Dec 03 21:13:26 compute-0 podman[109600]: 2025-12-03 21:13:26.603165035 +0000 UTC m=+0.026510791 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:13:26 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:13:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/593d36532f230cd0f1643f29b9aeb72c6a0c8ee58aea6344b2509f8d1d646cef/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/593d36532f230cd0f1643f29b9aeb72c6a0c8ee58aea6344b2509f8d1d646cef/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/593d36532f230cd0f1643f29b9aeb72c6a0c8ee58aea6344b2509f8d1d646cef/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/593d36532f230cd0f1643f29b9aeb72c6a0c8ee58aea6344b2509f8d1d646cef/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:26 compute-0 podman[109600]: 2025-12-03 21:13:26.732847645 +0000 UTC m=+0.156193331 container init 817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_leavitt, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:13:26 compute-0 podman[109600]: 2025-12-03 21:13:26.740734097 +0000 UTC m=+0.164079763 container start 817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_leavitt, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 03 21:13:26 compute-0 podman[109600]: 2025-12-03 21:13:26.745833193 +0000 UTC m=+0.169178879 container attach 817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_leavitt, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True)
Dec 03 21:13:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]: {
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:     "0": [
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:         {
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "devices": [
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "/dev/loop3"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             ],
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_name": "ceph_lv0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_size": "21470642176",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "name": "ceph_lv0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "tags": {
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cluster_name": "ceph",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.crush_device_class": "",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.encrypted": "0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.objectstore": "bluestore",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osd_id": "0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.type": "block",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.vdo": "0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.with_tpm": "0"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             },
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "type": "block",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "vg_name": "ceph_vg0"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:         }
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:     ],
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:     "1": [
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:         {
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "devices": [
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "/dev/loop4"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             ],
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_name": "ceph_lv1",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_size": "21470642176",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "name": "ceph_lv1",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "tags": {
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cluster_name": "ceph",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.crush_device_class": "",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.encrypted": "0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.objectstore": "bluestore",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osd_id": "1",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.type": "block",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.vdo": "0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.with_tpm": "0"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             },
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "type": "block",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "vg_name": "ceph_vg1"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:         }
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:     ],
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:     "2": [
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:         {
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "devices": [
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "/dev/loop5"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             ],
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_name": "ceph_lv2",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_size": "21470642176",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "name": "ceph_lv2",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "tags": {
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.cluster_name": "ceph",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.crush_device_class": "",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.encrypted": "0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.objectstore": "bluestore",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osd_id": "2",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.type": "block",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.vdo": "0",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:                 "ceph.with_tpm": "0"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             },
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "type": "block",
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:             "vg_name": "ceph_vg2"
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:         }
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]:     ]
Dec 03 21:13:27 compute-0 jovial_leavitt[109617]: }
Dec 03 21:13:27 compute-0 sudo[109751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clabbmmdlgylpwuebwkoortchncxhybu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796406.775602-146-99817038991014/AnsiballZ_setup.py'
Dec 03 21:13:27 compute-0 sudo[109751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:27 compute-0 systemd[1]: libpod-817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4.scope: Deactivated successfully.
Dec 03 21:13:27 compute-0 podman[109600]: 2025-12-03 21:13:27.102901199 +0000 UTC m=+0.526246935 container died 817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_leavitt, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:13:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-593d36532f230cd0f1643f29b9aeb72c6a0c8ee58aea6344b2509f8d1d646cef-merged.mount: Deactivated successfully.
Dec 03 21:13:27 compute-0 podman[109600]: 2025-12-03 21:13:27.163781259 +0000 UTC m=+0.587126955 container remove 817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_leavitt, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:13:27 compute-0 systemd[1]: libpod-conmon-817f0f2b97ffd928a48e4666d863bab72b3b486f68c66e150841f63d2d701cc4.scope: Deactivated successfully.
Dec 03 21:13:27 compute-0 sudo[109501]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v206: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:27 compute-0 sudo[109768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:13:27 compute-0 sudo[109768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:27 compute-0 sudo[109768]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:27 compute-0 sudo[109793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:13:27 compute-0 sudo[109793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:27 compute-0 python3.9[109753]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:13:27 compute-0 sudo[109751]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:13:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:13:27 compute-0 podman[109881]: 2025-12-03 21:13:27.701220492 +0000 UTC m=+0.052817974 container create 84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_brattain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:13:27 compute-0 systemd[1]: Started libpod-conmon-84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986.scope.
Dec 03 21:13:27 compute-0 podman[109881]: 2025-12-03 21:13:27.684172376 +0000 UTC m=+0.035769868 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:13:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:13:27 compute-0 podman[109881]: 2025-12-03 21:13:27.809357116 +0000 UTC m=+0.160954608 container init 84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_brattain, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:13:27 compute-0 podman[109881]: 2025-12-03 21:13:27.821369798 +0000 UTC m=+0.172967310 container start 84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_brattain, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:13:27 compute-0 podman[109881]: 2025-12-03 21:13:27.825396736 +0000 UTC m=+0.176994228 container attach 84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_brattain, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:13:27 compute-0 agitated_brattain[109941]: 167 167
Dec 03 21:13:27 compute-0 systemd[1]: libpod-84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986.scope: Deactivated successfully.
Dec 03 21:13:27 compute-0 podman[109881]: 2025-12-03 21:13:27.828842878 +0000 UTC m=+0.180440420 container died 84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_brattain, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:13:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-965802ae162c59b72d941624b2a6269f63d6e6e2aac0afa1942601f39185d983-merged.mount: Deactivated successfully.
Dec 03 21:13:27 compute-0 podman[109881]: 2025-12-03 21:13:27.883251834 +0000 UTC m=+0.234849316 container remove 84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_brattain, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:13:27 compute-0 systemd[1]: libpod-conmon-84283aaf51ee6fbd337265037466b8790a80dda67424df51a239f55cf37e6986.scope: Deactivated successfully.
Dec 03 21:13:27 compute-0 sudo[110016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzfjqfjuigvahdbjybwuhbopukugnmzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796407.612872-154-444747082356/AnsiballZ_stat.py'
Dec 03 21:13:27 compute-0 sudo[110016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:28 compute-0 podman[110024]: 2025-12-03 21:13:28.105457851 +0000 UTC m=+0.057408727 container create 6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_burnell, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:13:28 compute-0 systemd[76584]: Created slice User Background Tasks Slice.
Dec 03 21:13:28 compute-0 systemd[76584]: Starting Cleanup of User's Temporary Files and Directories...
Dec 03 21:13:28 compute-0 systemd[1]: Started libpod-conmon-6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5.scope.
Dec 03 21:13:28 compute-0 systemd[76584]: Finished Cleanup of User's Temporary Files and Directories.
Dec 03 21:13:28 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d08f100ec548369a78c276cac6c4e72884f6eb6c1f47fa08dda7fcc9f96e894/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d08f100ec548369a78c276cac6c4e72884f6eb6c1f47fa08dda7fcc9f96e894/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d08f100ec548369a78c276cac6c4e72884f6eb6c1f47fa08dda7fcc9f96e894/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2d08f100ec548369a78c276cac6c4e72884f6eb6c1f47fa08dda7fcc9f96e894/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:13:28 compute-0 podman[110024]: 2025-12-03 21:13:28.174229082 +0000 UTC m=+0.126179958 container init 6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_burnell, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:13:28 compute-0 podman[110024]: 2025-12-03 21:13:28.079815585 +0000 UTC m=+0.031766451 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:13:28 compute-0 podman[110024]: 2025-12-03 21:13:28.193458807 +0000 UTC m=+0.145409673 container start 6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_burnell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:13:28 compute-0 podman[110024]: 2025-12-03 21:13:28.197301639 +0000 UTC m=+0.149252515 container attach 6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_burnell, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:13:28 compute-0 python3.9[110018]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:13:28 compute-0 sudo[110016]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:28 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec 03 21:13:28 compute-0 ceph-mon[75204]: pgmap v206: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:28 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec 03 21:13:28 compute-0 sudo[110247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpqhtlnqwhgsgtapakuxqdhseivdsvlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796408.5232873-163-122272332743185/AnsiballZ_stat.py'
Dec 03 21:13:28 compute-0 sudo[110247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:28 compute-0 lvm[110271]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:13:28 compute-0 lvm[110271]: VG ceph_vg0 finished
Dec 03 21:13:28 compute-0 lvm[110274]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:13:28 compute-0 lvm[110274]: VG ceph_vg1 finished
Dec 03 21:13:28 compute-0 python3.9[110253]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:13:28 compute-0 sudo[110247]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:28 compute-0 lvm[110276]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:13:28 compute-0 lvm[110276]: VG ceph_vg2 finished
Dec 03 21:13:29 compute-0 compassionate_burnell[110042]: {}
Dec 03 21:13:29 compute-0 systemd[1]: libpod-6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5.scope: Deactivated successfully.
Dec 03 21:13:29 compute-0 systemd[1]: libpod-6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5.scope: Consumed 1.382s CPU time.
Dec 03 21:13:29 compute-0 podman[110024]: 2025-12-03 21:13:29.118718029 +0000 UTC m=+1.070668885 container died 6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_burnell, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:13:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-2d08f100ec548369a78c276cac6c4e72884f6eb6c1f47fa08dda7fcc9f96e894-merged.mount: Deactivated successfully.
Dec 03 21:13:29 compute-0 podman[110024]: 2025-12-03 21:13:29.164206537 +0000 UTC m=+1.116157363 container remove 6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_burnell, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:13:29 compute-0 systemd[1]: libpod-conmon-6dd5bff3bee5e442811fd03f928034f40c606689843645d18f1eaff81384d0b5.scope: Deactivated successfully.
Dec 03 21:13:29 compute-0 sudo[109793]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:13:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:13:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:13:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v207: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:13:29 compute-0 sudo[110338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:13:29 compute-0 sudo[110338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:13:29 compute-0 sudo[110338]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:29 compute-0 sudo[110465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxizczqboubmogphnfrmhohgipjukcof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796409.2414129-173-245405023010486/AnsiballZ_command.py'
Dec 03 21:13:29 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 03 21:13:29 compute-0 sudo[110465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:29 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 03 21:13:29 compute-0 python3.9[110467]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:13:29 compute-0 sudo[110465]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:29 compute-0 ceph-mon[75204]: 5.c scrub starts
Dec 03 21:13:29 compute-0 ceph-mon[75204]: 5.c scrub ok
Dec 03 21:13:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:13:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:13:30 compute-0 sudo[110618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japwlttjdjyhjgqyfkkgtmisrzknkmqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796410.0779445-183-35174848851788/AnsiballZ_service_facts.py'
Dec 03 21:13:30 compute-0 sudo[110618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:30 compute-0 ceph-mon[75204]: pgmap v207: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:30 compute-0 ceph-mon[75204]: 5.1d scrub starts
Dec 03 21:13:30 compute-0 ceph-mon[75204]: 5.1d scrub ok
Dec 03 21:13:30 compute-0 python3.9[110620]: ansible-service_facts Invoked
Dec 03 21:13:30 compute-0 network[110637]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:13:30 compute-0 network[110638]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:13:30 compute-0 network[110639]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:13:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v208: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:31 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 03 21:13:31 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 03 21:13:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:31 compute-0 ceph-mon[75204]: pgmap v208: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:31 compute-0 ceph-mon[75204]: 5.19 scrub starts
Dec 03 21:13:31 compute-0 ceph-mon[75204]: 5.19 scrub ok
Dec 03 21:13:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v209: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:33 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 03 21:13:33 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 03 21:13:34 compute-0 sudo[110618]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:34 compute-0 ceph-mon[75204]: pgmap v209: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:34 compute-0 ceph-mon[75204]: 2.9 scrub starts
Dec 03 21:13:34 compute-0 ceph-mon[75204]: 2.9 scrub ok
Dec 03 21:13:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v210: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:35 compute-0 sudo[110922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-precwrnyrnqyedgjjjusffucnptntqjy ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764796414.888815-198-79310728576158/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764796414.888815-198-79310728576158/args'
Dec 03 21:13:35 compute-0 sudo[110922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:35 compute-0 sudo[110922]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:35 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 03 21:13:35 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 03 21:13:35 compute-0 sudo[111089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saceiawtsvpzzeamdlfgkxqbkmzhrucn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796415.6071713-209-173142450668863/AnsiballZ_dnf.py'
Dec 03 21:13:35 compute-0 sudo[111089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:36 compute-0 python3.9[111091]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:13:36 compute-0 ceph-mon[75204]: pgmap v210: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:36 compute-0 ceph-mon[75204]: 5.1a scrub starts
Dec 03 21:13:36 compute-0 ceph-mon[75204]: 5.1a scrub ok
Dec 03 21:13:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v211: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:37 compute-0 sudo[111089]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:38 compute-0 ceph-mon[75204]: pgmap v211: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:38 compute-0 sudo[111242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hamvparbrxjphgbkzakxuykcrksujtgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796417.7764115-222-117564119239976/AnsiballZ_package_facts.py'
Dec 03 21:13:38 compute-0 sudo[111242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:38 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Dec 03 21:13:38 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Dec 03 21:13:38 compute-0 python3.9[111244]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 03 21:13:38 compute-0 sudo[111242]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v212: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:39 compute-0 ceph-mon[75204]: 5.18 scrub starts
Dec 03 21:13:39 compute-0 ceph-mon[75204]: 5.18 scrub ok
Dec 03 21:13:39 compute-0 sudo[111394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkbcbnviommjdhfamzkqyqqhwwgclebk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796419.3408809-232-161686609413109/AnsiballZ_stat.py'
Dec 03 21:13:39 compute-0 sudo[111394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:39 compute-0 python3.9[111396]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:13:39 compute-0 sudo[111394]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:40 compute-0 sudo[111472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qojbczormgyfaiflqtbvfgcwvyovagzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796419.3408809-232-161686609413109/AnsiballZ_file.py'
Dec 03 21:13:40 compute-0 sudo[111472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:40 compute-0 ceph-mon[75204]: pgmap v212: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:40 compute-0 python3.9[111474]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:13:40 compute-0 sudo[111472]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:40 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 03 21:13:40 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 03 21:13:40 compute-0 sudo[111624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhvqlefkskgczbmormommpsefmscttlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796420.6488383-244-156671705170197/AnsiballZ_stat.py'
Dec 03 21:13:40 compute-0 sudo[111624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:41 compute-0 python3.9[111626]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:13:41 compute-0 sudo[111624]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v213: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:41 compute-0 sudo[111702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfilcbrznvxxnwhoqgnkbwhjeyqrsbvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796420.6488383-244-156671705170197/AnsiballZ_file.py'
Dec 03 21:13:41 compute-0 sudo[111702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:41 compute-0 ceph-mon[75204]: 6.4 scrub starts
Dec 03 21:13:41 compute-0 ceph-mon[75204]: 6.4 scrub ok
Dec 03 21:13:41 compute-0 python3.9[111704]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:13:41 compute-0 sudo[111702]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 03 21:13:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 03 21:13:42 compute-0 ceph-mon[75204]: pgmap v213: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:42 compute-0 sudo[111854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvwprnlngmvecarglesevuvfgrpadgcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796422.2025769-262-229389179489933/AnsiballZ_lineinfile.py'
Dec 03 21:13:42 compute-0 sudo[111854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:42 compute-0 python3.9[111856]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:13:42 compute-0 sudo[111854]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v214: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:43 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 03 21:13:43 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 03 21:13:43 compute-0 ceph-mon[75204]: 6.b scrub starts
Dec 03 21:13:43 compute-0 ceph-mon[75204]: 6.b scrub ok
Dec 03 21:13:43 compute-0 sudo[112006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixramujpkrgarvnlpykrphsuqlwjavvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796423.3566194-277-124135319837152/AnsiballZ_setup.py'
Dec 03 21:13:43 compute-0 sudo[112006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:43 compute-0 python3.9[112008]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:13:44 compute-0 sudo[112006]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:44 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 03 21:13:44 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 03 21:13:44 compute-0 ceph-mon[75204]: pgmap v214: 177 pgs: 177 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:44 compute-0 ceph-mon[75204]: 6.1 scrub starts
Dec 03 21:13:44 compute-0 ceph-mon[75204]: 6.1 scrub ok
Dec 03 21:13:44 compute-0 sudo[112090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmcuztynvoqqpuyzjfwistdyeeqzxewn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796423.3566194-277-124135319837152/AnsiballZ_systemd.py'
Dec 03 21:13:44 compute-0 sudo[112090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:45 compute-0 python3.9[112092]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:13:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v215: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:45 compute-0 sudo[112090]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:45 compute-0 ceph-mon[75204]: 6.6 scrub starts
Dec 03 21:13:45 compute-0 ceph-mon[75204]: 6.6 scrub ok
Dec 03 21:13:45 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec 03 21:13:45 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec 03 21:13:45 compute-0 sshd-session[106969]: Connection closed by 192.168.122.30 port 36038
Dec 03 21:13:45 compute-0 sshd-session[106966]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:13:45 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Dec 03 21:13:45 compute-0 systemd[1]: session-37.scope: Consumed 26.024s CPU time.
Dec 03 21:13:45 compute-0 systemd-logind[787]: Session 37 logged out. Waiting for processes to exit.
Dec 03 21:13:45 compute-0 systemd-logind[787]: Removed session 37.
Dec 03 21:13:46 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec 03 21:13:46 compute-0 ceph-mon[75204]: pgmap v215: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:46 compute-0 ceph-mon[75204]: 6.2 scrub starts
Dec 03 21:13:46 compute-0 ceph-mon[75204]: 6.2 scrub ok
Dec 03 21:13:46 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec 03 21:13:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v216: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:47 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 03 21:13:47 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 03 21:13:47 compute-0 ceph-mon[75204]: 6.d scrub starts
Dec 03 21:13:47 compute-0 ceph-mon[75204]: 6.d scrub ok
Dec 03 21:13:47 compute-0 ceph-mon[75204]: pgmap v216: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:48 compute-0 ceph-mon[75204]: 6.c scrub starts
Dec 03 21:13:48 compute-0 ceph-mon[75204]: 6.c scrub ok
Dec 03 21:13:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v217: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:49 compute-0 ceph-mon[75204]: pgmap v217: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v218: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:51 compute-0 sshd-session[112119]: Accepted publickey for zuul from 192.168.122.30 port 46056 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:13:51 compute-0 systemd-logind[787]: New session 38 of user zuul.
Dec 03 21:13:51 compute-0 systemd[1]: Started Session 38 of User zuul.
Dec 03 21:13:51 compute-0 sshd-session[112119]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:13:51 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec 03 21:13:51 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec 03 21:13:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:13:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:13:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:13:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:13:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:13:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:13:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:52 compute-0 sudo[112272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cisxjcqcksmunwhyyzkbdwodlvozddsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796431.6145272-22-114955025293737/AnsiballZ_file.py'
Dec 03 21:13:52 compute-0 sudo[112272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:52 compute-0 ceph-mon[75204]: pgmap v218: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:52 compute-0 ceph-mon[75204]: 6.e scrub starts
Dec 03 21:13:52 compute-0 ceph-mon[75204]: 6.e scrub ok
Dec 03 21:13:52 compute-0 python3.9[112274]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:13:52 compute-0 sudo[112272]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:52 compute-0 sudo[112424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdtrxokjixjgnhtgycfhtapvqvjyguuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796432.5534399-34-250209131791741/AnsiballZ_stat.py'
Dec 03 21:13:52 compute-0 sudo[112424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:53 compute-0 python3.9[112426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:13:53 compute-0 sudo[112424]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v219: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:53 compute-0 sudo[112502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqamzjhinsdgkosqlrtavxulfsirzyas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796432.5534399-34-250209131791741/AnsiballZ_file.py'
Dec 03 21:13:53 compute-0 sudo[112502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:13:53 compute-0 python3.9[112504]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:13:53 compute-0 sudo[112502]: pam_unix(sudo:session): session closed for user root
Dec 03 21:13:54 compute-0 sshd-session[112122]: Connection closed by 192.168.122.30 port 46056
Dec 03 21:13:54 compute-0 sshd-session[112119]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:13:54 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Dec 03 21:13:54 compute-0 systemd[1]: session-38.scope: Consumed 1.811s CPU time.
Dec 03 21:13:54 compute-0 systemd-logind[787]: Session 38 logged out. Waiting for processes to exit.
Dec 03 21:13:54 compute-0 systemd-logind[787]: Removed session 38.
Dec 03 21:13:54 compute-0 ceph-mon[75204]: pgmap v219: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v220: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:56 compute-0 ceph-mon[75204]: pgmap v220: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:13:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v221: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:58 compute-0 ceph-mon[75204]: pgmap v221: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v222: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:13:59 compute-0 sshd-session[112529]: Accepted publickey for zuul from 192.168.122.30 port 54120 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:13:59 compute-0 systemd-logind[787]: New session 39 of user zuul.
Dec 03 21:13:59 compute-0 systemd[1]: Started Session 39 of User zuul.
Dec 03 21:14:00 compute-0 sshd-session[112529]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:14:00 compute-0 ceph-mon[75204]: pgmap v222: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:01 compute-0 python3.9[112682]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:14:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v223: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:02 compute-0 sudo[112836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yihyfqjvkxabktqhihfsagzqgkofypib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796441.59679-33-111560709464578/AnsiballZ_file.py'
Dec 03 21:14:02 compute-0 sudo[112836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:02 compute-0 python3.9[112838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:02 compute-0 sudo[112836]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:02 compute-0 ceph-mon[75204]: pgmap v223: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:03 compute-0 sudo[113011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhhzwhxmoeuzrcmnblbhhrsbehayxyyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796442.5954933-41-111549449216244/AnsiballZ_stat.py'
Dec 03 21:14:03 compute-0 sudo[113011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v224: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:03 compute-0 python3.9[113013]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:03 compute-0 sudo[113011]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:03 compute-0 sudo[113089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leabdlkrvcynjbiywvmtpewkbhhzshkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796442.5954933-41-111549449216244/AnsiballZ_file.py'
Dec 03 21:14:03 compute-0 sudo[113089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:03 compute-0 python3.9[113091]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.r4rq9cn2 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:03 compute-0 sudo[113089]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:04 compute-0 ceph-mon[75204]: pgmap v224: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:04 compute-0 sudo[113241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzfnrcxaipfvcqyifkxkhbwaxfrrpsco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796444.4449313-61-74930887631256/AnsiballZ_stat.py'
Dec 03 21:14:04 compute-0 sudo[113241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:04 compute-0 python3.9[113243]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:04 compute-0 sudo[113241]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v225: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:05 compute-0 sudo[113319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aspqjhatujtajbjlxgxqgvmcqislqjym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796444.4449313-61-74930887631256/AnsiballZ_file.py'
Dec 03 21:14:05 compute-0 sudo[113319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:05 compute-0 python3.9[113321]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.h9vsxned recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:05 compute-0 sudo[113319]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:06 compute-0 sudo[113471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkzdteozcyqfgwbbrphtliqhuvkoytjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796445.740621-74-25195091941834/AnsiballZ_file.py'
Dec 03 21:14:06 compute-0 sudo[113471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:06 compute-0 python3.9[113473]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:14:06 compute-0 sudo[113471]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:06 compute-0 ceph-mon[75204]: pgmap v225: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:06 compute-0 sudo[113623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyfhebiatnyvkswnewimhzthddilltce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796446.403859-82-154450322435122/AnsiballZ_stat.py'
Dec 03 21:14:06 compute-0 sudo[113623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:06 compute-0 python3.9[113625]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:06 compute-0 sudo[113623]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:07 compute-0 sudo[113701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbzndyxxqidinxfintracqhtmvezweie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796446.403859-82-154450322435122/AnsiballZ_file.py'
Dec 03 21:14:07 compute-0 sudo[113701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v226: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:07 compute-0 python3.9[113703]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:14:07 compute-0 sudo[113701]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:07 compute-0 sudo[113853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xauiwgqwauynmxhryfsppfzukggoicme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796447.5006928-82-273774870434325/AnsiballZ_stat.py'
Dec 03 21:14:07 compute-0 sudo[113853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:07 compute-0 python3.9[113855]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:08 compute-0 sudo[113853]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:08 compute-0 sudo[113931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfkfoyqajajwbmotnddicdzpawejcyjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796447.5006928-82-273774870434325/AnsiballZ_file.py'
Dec 03 21:14:08 compute-0 sudo[113931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:08 compute-0 ceph-mon[75204]: pgmap v226: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:08 compute-0 python3.9[113933]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:14:08 compute-0 sudo[113931]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:09 compute-0 sudo[114083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjppqzjktskbkqxglvwqvtcrvkgpiukm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796448.7115316-105-185573720064007/AnsiballZ_file.py'
Dec 03 21:14:09 compute-0 sudo[114083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:09 compute-0 python3.9[114085]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:09 compute-0 sudo[114083]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v227: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:09 compute-0 sudo[114235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raiaiciykcxkoauxayajvuyroajpqrap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796449.4432557-113-25113199808265/AnsiballZ_stat.py'
Dec 03 21:14:09 compute-0 sudo[114235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:09 compute-0 python3.9[114237]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:09 compute-0 sudo[114235]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:10 compute-0 sudo[114314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fonvbvgnimrtvagtmucusskohnxcahyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796449.4432557-113-25113199808265/AnsiballZ_file.py'
Dec 03 21:14:10 compute-0 sudo[114314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:10 compute-0 python3.9[114316]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:10 compute-0 sudo[114314]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:10 compute-0 ceph-mon[75204]: pgmap v227: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:10 compute-0 sudo[114466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soefbckxaoauwylzzwfdqyvnxbrsjaxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796450.5927513-125-192703803303934/AnsiballZ_stat.py'
Dec 03 21:14:10 compute-0 sudo[114466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:11 compute-0 python3.9[114468]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:11 compute-0 sudo[114466]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v228: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:11 compute-0 sudo[114544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nddrfghmouutfffydovbzgchphtqbjcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796450.5927513-125-192703803303934/AnsiballZ_file.py'
Dec 03 21:14:11 compute-0 sudo[114544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:11 compute-0 python3.9[114546]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:11 compute-0 sudo[114544]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:12 compute-0 sudo[114696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuigrieirwhycmitngqwzwlecmzdmetq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796451.7734985-137-30702085846268/AnsiballZ_systemd.py'
Dec 03 21:14:12 compute-0 sudo[114696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:12 compute-0 ceph-mon[75204]: pgmap v228: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:12 compute-0 python3.9[114698]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:14:12 compute-0 systemd[1]: Reloading.
Dec 03 21:14:12 compute-0 systemd-sysv-generator[114728]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:14:12 compute-0 systemd-rc-local-generator[114723]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:14:13 compute-0 sudo[114696]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v229: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:13 compute-0 sudo[114886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wroecbouqtatanzhzhhjthikesswyumt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796453.368316-145-119620070985957/AnsiballZ_stat.py'
Dec 03 21:14:13 compute-0 sudo[114886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:13 compute-0 python3.9[114888]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:13 compute-0 sudo[114886]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:14 compute-0 sudo[114964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgtxyywwdbrcsuawccubzqoyxraqwnce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796453.368316-145-119620070985957/AnsiballZ_file.py'
Dec 03 21:14:14 compute-0 sudo[114964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:14 compute-0 python3.9[114966]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:14 compute-0 sudo[114964]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:14 compute-0 ceph-mon[75204]: pgmap v229: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:15 compute-0 sudo[115116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycrftqwttovtsgjeiqtqniqfhygboqrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796454.6007602-157-109180715625948/AnsiballZ_stat.py'
Dec 03 21:14:15 compute-0 sudo[115116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:15 compute-0 python3.9[115118]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v230: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:15 compute-0 sudo[115116]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:15 compute-0 sudo[115194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mptfehtfhxkpnvxbcuojaaockupcqfbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796454.6007602-157-109180715625948/AnsiballZ_file.py'
Dec 03 21:14:15 compute-0 sudo[115194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:15 compute-0 python3.9[115196]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:15 compute-0 sudo[115194]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:16 compute-0 sudo[115346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyjsptfnipdnrrfaoluqymlsjdkdmoxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796455.931915-169-193535575020958/AnsiballZ_systemd.py'
Dec 03 21:14:16 compute-0 sudo[115346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:16 compute-0 ceph-mon[75204]: pgmap v230: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:16 compute-0 python3.9[115348]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:14:16 compute-0 systemd[1]: Reloading.
Dec 03 21:14:16 compute-0 systemd-rc-local-generator[115379]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:14:16 compute-0 systemd-sysv-generator[115383]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:14:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:16 compute-0 systemd[1]: Starting Create netns directory...
Dec 03 21:14:16 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 03 21:14:16 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 03 21:14:16 compute-0 systemd[1]: Finished Create netns directory.
Dec 03 21:14:17 compute-0 sudo[115346]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v231: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:17 compute-0 python3.9[115543]: ansible-ansible.builtin.service_facts Invoked
Dec 03 21:14:17 compute-0 network[115560]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:14:17 compute-0 network[115561]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:14:17 compute-0 network[115562]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:14:18 compute-0 ceph-mon[75204]: pgmap v231: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v232: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:20 compute-0 ceph-mon[75204]: pgmap v232: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:14:21
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.meta', '.mgr', 'cephfs.cephfs.data', 'volumes', 'vms', 'backups', 'images']
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v233: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:21 compute-0 ceph-mon[75204]: pgmap v233: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:21 compute-0 sudo[115822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmjgywgvtipfebrfuhtelvqbewriyrqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796461.3207588-195-234747110802837/AnsiballZ_stat.py'
Dec 03 21:14:21 compute-0 sudo[115822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:14:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:14:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:14:21 compute-0 python3.9[115824]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:21 compute-0 sudo[115822]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:22 compute-0 sudo[115900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtapzdkhdyvkxkrelqtykwhenmeevasz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796461.3207588-195-234747110802837/AnsiballZ_file.py'
Dec 03 21:14:22 compute-0 sudo[115900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:22 compute-0 python3.9[115902]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:22 compute-0 sudo[115900]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:22 compute-0 sudo[116052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzdwmukgvlavjgfrvwygscsqtqsymcfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796462.5176957-208-245117035506699/AnsiballZ_file.py'
Dec 03 21:14:22 compute-0 sudo[116052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:23 compute-0 python3.9[116054]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:23 compute-0 sudo[116052]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v234: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:23 compute-0 sudo[116204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijwabwdjywjxowbvmiofgrdyswtnlpuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796463.222518-216-66067272427714/AnsiballZ_stat.py'
Dec 03 21:14:23 compute-0 sudo[116204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:23 compute-0 python3.9[116206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:23 compute-0 sudo[116204]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:23 compute-0 sudo[116282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocqvfqzajtgtymggcviplvhtbbpsyvpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796463.222518-216-66067272427714/AnsiballZ_file.py'
Dec 03 21:14:23 compute-0 sudo[116282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:24 compute-0 python3.9[116284]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:24 compute-0 sudo[116282]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:24 compute-0 ceph-mon[75204]: pgmap v234: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:24 compute-0 sudo[116434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdjlqdhqbzfvpsclpoyuspccyghzjprc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796464.4799485-231-136182120828811/AnsiballZ_timezone.py'
Dec 03 21:14:24 compute-0 sudo[116434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:25 compute-0 python3.9[116436]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 03 21:14:25 compute-0 systemd[1]: Starting Time & Date Service...
Dec 03 21:14:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v235: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:25 compute-0 systemd[1]: Started Time & Date Service.
Dec 03 21:14:25 compute-0 sudo[116434]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:25 compute-0 sudo[116590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzoqlogbdlagjuyepueutbwhqhaywpzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796465.592162-240-1000927872562/AnsiballZ_file.py'
Dec 03 21:14:25 compute-0 sudo[116590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:26 compute-0 python3.9[116592]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:26 compute-0 sudo[116590]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:26 compute-0 sudo[116742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buebhdlvbpfbpoifjwzpsupfknmgtqlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796466.315134-248-246361766667192/AnsiballZ_stat.py'
Dec 03 21:14:26 compute-0 sudo[116742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:26 compute-0 python3.9[116744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:26 compute-0 sudo[116742]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:27 compute-0 sudo[116820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snikiivxtxjosnklqhynuhbfvmvgcikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796466.315134-248-246361766667192/AnsiballZ_file.py'
Dec 03 21:14:27 compute-0 sudo[116820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v236: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:27 compute-0 python3.9[116822]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:27 compute-0 sudo[116820]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:14:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:14:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:27 compute-0 ceph-mon[75204]: pgmap v235: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:27 compute-0 ceph-mon[75204]: pgmap v236: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:28 compute-0 sudo[116972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uniemyqjwnxwuxfhanbnyeuogawwsltz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796467.8709283-260-244755902932000/AnsiballZ_stat.py'
Dec 03 21:14:28 compute-0 sudo[116972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:28 compute-0 python3.9[116974]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:28 compute-0 sudo[116972]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:28 compute-0 sudo[117050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejawjngelzcjsvhkczceptamcibpwrqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796467.8709283-260-244755902932000/AnsiballZ_file.py'
Dec 03 21:14:28 compute-0 sudo[117050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:28 compute-0 python3.9[117052]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.lo55t4w7 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:28 compute-0 sudo[117050]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v237: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:29 compute-0 sudo[117202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwqlnokrvujajcqwezwepabriypqqibo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796468.985677-272-154306420359871/AnsiballZ_stat.py'
Dec 03 21:14:29 compute-0 sudo[117202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:29 compute-0 sudo[117205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:14:29 compute-0 sudo[117205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:29 compute-0 sudo[117205]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:29 compute-0 sudo[117230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:14:29 compute-0 sudo[117230]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:29 compute-0 python3.9[117204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:29 compute-0 sudo[117202]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:29 compute-0 ceph-mon[75204]: pgmap v237: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:29 compute-0 sudo[117343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhdntevbjeuadxynwmgeqxydrdgvkxcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796468.985677-272-154306420359871/AnsiballZ_file.py'
Dec 03 21:14:29 compute-0 sudo[117343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:29 compute-0 python3.9[117346]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:29 compute-0 sudo[117230]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:14:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:14:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:14:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:14:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:14:29 compute-0 sudo[117343]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:14:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:14:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:14:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:14:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:14:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:14:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:14:30 compute-0 sudo[117363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:14:30 compute-0 sudo[117363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:30 compute-0 sudo[117363]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:30 compute-0 sudo[117412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:14:30 compute-0 sudo[117412]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:30 compute-0 podman[117501]: 2025-12-03 21:14:30.378031828 +0000 UTC m=+0.027910311 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:14:30 compute-0 sudo[117588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrjquiyqcvzivtmaqeorrwtdzrveajog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796470.1630843-285-281313458477579/AnsiballZ_command.py'
Dec 03 21:14:30 compute-0 sudo[117588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:30 compute-0 python3.9[117590]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:14:30 compute-0 sudo[117588]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:31 compute-0 podman[117501]: 2025-12-03 21:14:31.036139876 +0000 UTC m=+0.686018359 container create 72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_carver, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:14:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:14:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:14:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:14:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:14:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:14:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:14:31 compute-0 systemd[1]: Started libpod-conmon-72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b.scope.
Dec 03 21:14:31 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:14:31 compute-0 podman[117501]: 2025-12-03 21:14:31.179035233 +0000 UTC m=+0.828913736 container init 72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_carver, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:14:31 compute-0 podman[117501]: 2025-12-03 21:14:31.190307199 +0000 UTC m=+0.840185682 container start 72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_carver, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:14:31 compute-0 romantic_carver[117670]: 167 167
Dec 03 21:14:31 compute-0 podman[117501]: 2025-12-03 21:14:31.194701819 +0000 UTC m=+0.844580372 container attach 72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_carver, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:14:31 compute-0 systemd[1]: libpod-72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b.scope: Deactivated successfully.
Dec 03 21:14:31 compute-0 podman[117501]: 2025-12-03 21:14:31.196396295 +0000 UTC m=+0.846274818 container died 72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_carver, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:14:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-e27a0d0b4b65d96262889fceed5aad13e678fc7e6642b0d16636335ca7736fcf-merged.mount: Deactivated successfully.
Dec 03 21:14:31 compute-0 podman[117501]: 2025-12-03 21:14:31.24548111 +0000 UTC m=+0.895359553 container remove 72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_carver, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:14:31 compute-0 systemd[1]: libpod-conmon-72a7841b0f36710d63d6788970562f298be7c9e31a5064a63831afc4ff07151b.scope: Deactivated successfully.
Dec 03 21:14:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v238: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:31 compute-0 podman[117717]: 2025-12-03 21:14:31.442883838 +0000 UTC m=+0.063061125 container create ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=clever_greider, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:14:31 compute-0 systemd[1]: Started libpod-conmon-ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e.scope.
Dec 03 21:14:31 compute-0 podman[117717]: 2025-12-03 21:14:31.420463849 +0000 UTC m=+0.040641166 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:14:31 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fed73fb47b181cd95fd48c0dfe09f2f6c906b3fee2203cdc98c434e8469c4e22/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fed73fb47b181cd95fd48c0dfe09f2f6c906b3fee2203cdc98c434e8469c4e22/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fed73fb47b181cd95fd48c0dfe09f2f6c906b3fee2203cdc98c434e8469c4e22/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fed73fb47b181cd95fd48c0dfe09f2f6c906b3fee2203cdc98c434e8469c4e22/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fed73fb47b181cd95fd48c0dfe09f2f6c906b3fee2203cdc98c434e8469c4e22/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:31 compute-0 sudo[117787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylosrqxldueoygyuqozqpchlyaxitqzz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764796471.0203977-293-72719944395127/AnsiballZ_edpm_nftables_from_files.py'
Dec 03 21:14:31 compute-0 sudo[117787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:31 compute-0 podman[117717]: 2025-12-03 21:14:31.53079714 +0000 UTC m=+0.150974427 container init ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=clever_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:14:31 compute-0 podman[117717]: 2025-12-03 21:14:31.539082955 +0000 UTC m=+0.159260222 container start ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=clever_greider, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:14:31 compute-0 podman[117717]: 2025-12-03 21:14:31.54332433 +0000 UTC m=+0.163501627 container attach ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=clever_greider, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:14:31 compute-0 python3[117789]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 03 21:14:31 compute-0 sudo[117787]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:31 compute-0 clever_greider[117779]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:14:31 compute-0 clever_greider[117779]: --> All data devices are unavailable
Dec 03 21:14:32 compute-0 systemd[1]: libpod-ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e.scope: Deactivated successfully.
Dec 03 21:14:32 compute-0 podman[117717]: 2025-12-03 21:14:32.042239329 +0000 UTC m=+0.662416636 container died ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=clever_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:14:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-fed73fb47b181cd95fd48c0dfe09f2f6c906b3fee2203cdc98c434e8469c4e22-merged.mount: Deactivated successfully.
Dec 03 21:14:32 compute-0 podman[117717]: 2025-12-03 21:14:32.09628817 +0000 UTC m=+0.716465467 container remove ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=clever_greider, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:14:32 compute-0 systemd[1]: libpod-conmon-ad1e0b2da5b95170826759fb2f58fd4ac344ae70b864378eac097c7bd35f4a7e.scope: Deactivated successfully.
Dec 03 21:14:32 compute-0 ceph-mon[75204]: pgmap v238: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:32 compute-0 sudo[117412]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:32 compute-0 sudo[117895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:14:32 compute-0 sudo[117895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:32 compute-0 sudo[117895]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:32 compute-0 sudo[117933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:14:32 compute-0 sudo[117933]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:32 compute-0 sudo[118018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrnvkwfljhrnmwwulqeskxuksflkymzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796472.0206988-301-247601842357528/AnsiballZ_stat.py'
Dec 03 21:14:32 compute-0 sudo[118018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:32 compute-0 podman[118033]: 2025-12-03 21:14:32.606375112 +0000 UTC m=+0.069362557 container create 7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_solomon, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:14:32 compute-0 python3.9[118020]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:32 compute-0 systemd[1]: Started libpod-conmon-7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da.scope.
Dec 03 21:14:32 compute-0 podman[118033]: 2025-12-03 21:14:32.576366286 +0000 UTC m=+0.039353781 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:14:32 compute-0 sudo[118018]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:32 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:14:32 compute-0 podman[118033]: 2025-12-03 21:14:32.717284408 +0000 UTC m=+0.180271813 container init 7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_solomon, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default)
Dec 03 21:14:32 compute-0 podman[118033]: 2025-12-03 21:14:32.729227204 +0000 UTC m=+0.192214649 container start 7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_solomon, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:14:32 compute-0 silly_solomon[118052]: 167 167
Dec 03 21:14:32 compute-0 podman[118033]: 2025-12-03 21:14:32.733440328 +0000 UTC m=+0.196427733 container attach 7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_solomon, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:14:32 compute-0 systemd[1]: libpod-7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da.scope: Deactivated successfully.
Dec 03 21:14:32 compute-0 podman[118033]: 2025-12-03 21:14:32.734548449 +0000 UTC m=+0.197535854 container died 7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:14:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-bf4f6b216ef3ba2c187af50845cc8de32618126f629069858004a31c4f6f6a57-merged.mount: Deactivated successfully.
Dec 03 21:14:32 compute-0 podman[118033]: 2025-12-03 21:14:32.77028134 +0000 UTC m=+0.233268755 container remove 7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_solomon, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:14:32 compute-0 systemd[1]: libpod-conmon-7f0ccf942c6e1f9c11204cdd3fb56e103f6a6fddbc9d7d1717b039a2817f69da.scope: Deactivated successfully.
Dec 03 21:14:32 compute-0 sudo[118159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tihrwweqblsyxecmvljudqdvjmnciaai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796472.0206988-301-247601842357528/AnsiballZ_file.py'
Dec 03 21:14:32 compute-0 sudo[118159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:32 compute-0 podman[118124]: 2025-12-03 21:14:32.928297698 +0000 UTC m=+0.052838738 container create e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_kirch, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:14:32 compute-0 systemd[1]: Started libpod-conmon-e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0.scope.
Dec 03 21:14:32 compute-0 podman[118124]: 2025-12-03 21:14:32.903157934 +0000 UTC m=+0.027698874 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:14:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:14:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0598e5d79eb2b2105863011169406716bdb438d2f14acf19470f4375b59b9a4a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0598e5d79eb2b2105863011169406716bdb438d2f14acf19470f4375b59b9a4a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0598e5d79eb2b2105863011169406716bdb438d2f14acf19470f4375b59b9a4a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0598e5d79eb2b2105863011169406716bdb438d2f14acf19470f4375b59b9a4a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:33 compute-0 podman[118124]: 2025-12-03 21:14:33.026817047 +0000 UTC m=+0.151358047 container init e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:14:33 compute-0 podman[118124]: 2025-12-03 21:14:33.039499241 +0000 UTC m=+0.164040181 container start e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_kirch, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:14:33 compute-0 podman[118124]: 2025-12-03 21:14:33.049268668 +0000 UTC m=+0.173809578 container attach e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_kirch, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:14:33 compute-0 python3.9[118163]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:33 compute-0 sudo[118159]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v239: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:33 compute-0 musing_kirch[118166]: {
Dec 03 21:14:33 compute-0 musing_kirch[118166]:     "0": [
Dec 03 21:14:33 compute-0 musing_kirch[118166]:         {
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "devices": [
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "/dev/loop3"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             ],
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_name": "ceph_lv0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_size": "21470642176",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "name": "ceph_lv0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "tags": {
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cluster_name": "ceph",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.crush_device_class": "",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.encrypted": "0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.objectstore": "bluestore",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osd_id": "0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.type": "block",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.vdo": "0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.with_tpm": "0"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             },
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "type": "block",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "vg_name": "ceph_vg0"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:         }
Dec 03 21:14:33 compute-0 musing_kirch[118166]:     ],
Dec 03 21:14:33 compute-0 musing_kirch[118166]:     "1": [
Dec 03 21:14:33 compute-0 musing_kirch[118166]:         {
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "devices": [
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "/dev/loop4"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             ],
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_name": "ceph_lv1",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_size": "21470642176",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "name": "ceph_lv1",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "tags": {
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cluster_name": "ceph",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.crush_device_class": "",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.encrypted": "0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.objectstore": "bluestore",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osd_id": "1",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.type": "block",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.vdo": "0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.with_tpm": "0"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             },
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "type": "block",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "vg_name": "ceph_vg1"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:         }
Dec 03 21:14:33 compute-0 musing_kirch[118166]:     ],
Dec 03 21:14:33 compute-0 musing_kirch[118166]:     "2": [
Dec 03 21:14:33 compute-0 musing_kirch[118166]:         {
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "devices": [
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "/dev/loop5"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             ],
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_name": "ceph_lv2",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_size": "21470642176",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "name": "ceph_lv2",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "tags": {
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.cluster_name": "ceph",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.crush_device_class": "",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.encrypted": "0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.objectstore": "bluestore",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osd_id": "2",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.type": "block",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.vdo": "0",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:                 "ceph.with_tpm": "0"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             },
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "type": "block",
Dec 03 21:14:33 compute-0 musing_kirch[118166]:             "vg_name": "ceph_vg2"
Dec 03 21:14:33 compute-0 musing_kirch[118166]:         }
Dec 03 21:14:33 compute-0 musing_kirch[118166]:     ]
Dec 03 21:14:33 compute-0 musing_kirch[118166]: }
Dec 03 21:14:33 compute-0 systemd[1]: libpod-e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0.scope: Deactivated successfully.
Dec 03 21:14:33 compute-0 podman[118124]: 2025-12-03 21:14:33.356910365 +0000 UTC m=+0.481451285 container died e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_kirch, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:14:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-0598e5d79eb2b2105863011169406716bdb438d2f14acf19470f4375b59b9a4a-merged.mount: Deactivated successfully.
Dec 03 21:14:33 compute-0 podman[118124]: 2025-12-03 21:14:33.396002208 +0000 UTC m=+0.520543118 container remove e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:14:33 compute-0 systemd[1]: libpod-conmon-e4f9da23c2f9f64c5c67e3ac3fb57c0ac7ec9a456cad56cee52a428b7fbe84b0.scope: Deactivated successfully.
Dec 03 21:14:33 compute-0 sudo[117933]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:33 compute-0 sudo[118293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:14:33 compute-0 sudo[118293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:33 compute-0 sudo[118293]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:33 compute-0 sudo[118379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjcujjlncncftnmnovwotfssyroppkgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796473.2615194-313-3714455352861/AnsiballZ_stat.py'
Dec 03 21:14:33 compute-0 sudo[118379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:33 compute-0 sudo[118343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:14:33 compute-0 sudo[118343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:33 compute-0 python3.9[118386]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:33 compute-0 sudo[118379]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:33 compute-0 podman[118404]: 2025-12-03 21:14:33.866639198 +0000 UTC m=+0.046961588 container create 640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_carver, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:14:33 compute-0 systemd[1]: Started libpod-conmon-640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9.scope.
Dec 03 21:14:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:14:33 compute-0 podman[118404]: 2025-12-03 21:14:33.840400584 +0000 UTC m=+0.020723034 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:14:33 compute-0 podman[118404]: 2025-12-03 21:14:33.943878788 +0000 UTC m=+0.124201228 container init 640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_carver, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:14:33 compute-0 podman[118404]: 2025-12-03 21:14:33.950619012 +0000 UTC m=+0.130941402 container start 640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_carver, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:14:33 compute-0 frosty_carver[118443]: 167 167
Dec 03 21:14:33 compute-0 systemd[1]: libpod-640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9.scope: Deactivated successfully.
Dec 03 21:14:33 compute-0 podman[118404]: 2025-12-03 21:14:33.988695267 +0000 UTC m=+0.169017697 container attach 640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_carver, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:14:33 compute-0 podman[118404]: 2025-12-03 21:14:33.989218291 +0000 UTC m=+0.169540711 container died 640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_carver, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:14:34 compute-0 sudo[118509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcwwaustyxwhlvomklecfiltpywyvadd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796473.2615194-313-3714455352861/AnsiballZ_file.py'
Dec 03 21:14:34 compute-0 sudo[118509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:34 compute-0 python3.9[118511]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:34 compute-0 sudo[118509]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-4321be52e9e7caafd66a159d49fff1e1d552ca82ee1600f392dee7c05ae3f7d0-merged.mount: Deactivated successfully.
Dec 03 21:14:34 compute-0 ceph-mon[75204]: pgmap v239: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:34 compute-0 podman[118404]: 2025-12-03 21:14:34.335863289 +0000 UTC m=+0.516185669 container remove 640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_carver, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:14:34 compute-0 systemd[1]: libpod-conmon-640ec46ac95a9c31a7e89efbacb0c19ca9337302398eaceee4079f5ce8297ff9.scope: Deactivated successfully.
Dec 03 21:14:34 compute-0 podman[118569]: 2025-12-03 21:14:34.501990567 +0000 UTC m=+0.046290720 container create 933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_hoover, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 03 21:14:34 compute-0 systemd[1]: Started libpod-conmon-933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb.scope.
Dec 03 21:14:34 compute-0 podman[118569]: 2025-12-03 21:14:34.480146044 +0000 UTC m=+0.024446287 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:14:34 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:14:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e28c14796f19487aa2876bae9f6b99411d0208e4b0c6dec01f7f3b30813e787e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e28c14796f19487aa2876bae9f6b99411d0208e4b0c6dec01f7f3b30813e787e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e28c14796f19487aa2876bae9f6b99411d0208e4b0c6dec01f7f3b30813e787e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e28c14796f19487aa2876bae9f6b99411d0208e4b0c6dec01f7f3b30813e787e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:14:34 compute-0 podman[118569]: 2025-12-03 21:14:34.602209333 +0000 UTC m=+0.146509596 container init 933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_hoover, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:14:34 compute-0 podman[118569]: 2025-12-03 21:14:34.610437967 +0000 UTC m=+0.154738130 container start 933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_hoover, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:14:34 compute-0 podman[118569]: 2025-12-03 21:14:34.614297832 +0000 UTC m=+0.158598035 container attach 933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_hoover, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:14:34 compute-0 sudo[118692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mokvdyyvnqjgpraaccxtbmuqavsorapi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796474.4486787-325-138621026023638/AnsiballZ_stat.py'
Dec 03 21:14:34 compute-0 sudo[118692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:35 compute-0 python3.9[118699]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:35 compute-0 sudo[118692]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v240: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:35 compute-0 sudo[118840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkkgnnskgnflltfrqprrdomyapqutyqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796474.4486787-325-138621026023638/AnsiballZ_file.py'
Dec 03 21:14:35 compute-0 sudo[118840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:35 compute-0 lvm[118844]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:14:35 compute-0 lvm[118844]: VG ceph_vg0 finished
Dec 03 21:14:35 compute-0 lvm[118847]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:14:35 compute-0 lvm[118847]: VG ceph_vg1 finished
Dec 03 21:14:35 compute-0 lvm[118849]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:14:35 compute-0 lvm[118849]: VG ceph_vg2 finished
Dec 03 21:14:35 compute-0 beautiful_hoover[118614]: {}
Dec 03 21:14:35 compute-0 systemd[1]: libpod-933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb.scope: Deactivated successfully.
Dec 03 21:14:35 compute-0 systemd[1]: libpod-933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb.scope: Consumed 1.303s CPU time.
Dec 03 21:14:35 compute-0 podman[118569]: 2025-12-03 21:14:35.440121931 +0000 UTC m=+0.984422124 container died 933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_hoover, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0)
Dec 03 21:14:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-e28c14796f19487aa2876bae9f6b99411d0208e4b0c6dec01f7f3b30813e787e-merged.mount: Deactivated successfully.
Dec 03 21:14:35 compute-0 python3.9[118842]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:35 compute-0 podman[118569]: 2025-12-03 21:14:35.484884289 +0000 UTC m=+1.029184462 container remove 933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_hoover, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:14:35 compute-0 systemd[1]: libpod-conmon-933979592f23e32b26f05f64a5a270b81385af41d3118f7559e7dd6e83c10cdb.scope: Deactivated successfully.
Dec 03 21:14:35 compute-0 sudo[118343]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:35 compute-0 sudo[118840]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:14:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:14:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:14:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:14:35 compute-0 sudo[118868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:14:35 compute-0 sudo[118868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:14:35 compute-0 sudo[118868]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:36 compute-0 sudo[119038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wghbrcvnsljmfdjkuvhvleqxmrxifuhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796475.6714044-337-98756273483362/AnsiballZ_stat.py'
Dec 03 21:14:36 compute-0 sudo[119038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:36 compute-0 python3.9[119040]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:36 compute-0 sudo[119038]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:36 compute-0 sudo[119116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oljkmjitgeogjtrppppovmvjwocpclno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796475.6714044-337-98756273483362/AnsiballZ_file.py'
Dec 03 21:14:36 compute-0 sudo[119116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:36 compute-0 ceph-mon[75204]: pgmap v240: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:36 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:14:36 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:14:36 compute-0 python3.9[119118]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:36 compute-0 sudo[119116]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v241: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:37 compute-0 sudo[119268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udqtvgccdcevulbaxranqmeowfiswhqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796476.9503891-349-144362496265815/AnsiballZ_stat.py'
Dec 03 21:14:37 compute-0 sudo[119268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:37 compute-0 python3.9[119270]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:37 compute-0 sudo[119268]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:37 compute-0 sudo[119346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fczelofdfdcrwrxmantzcxidyhiouhea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796476.9503891-349-144362496265815/AnsiballZ_file.py'
Dec 03 21:14:37 compute-0 sudo[119346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:38 compute-0 python3.9[119348]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:38 compute-0 sudo[119346]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:38 compute-0 sudo[119498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylueolaqmppesaulyreoyxnchkrftmec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796478.2502706-362-99351106958677/AnsiballZ_command.py'
Dec 03 21:14:38 compute-0 sudo[119498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:38 compute-0 ceph-mon[75204]: pgmap v241: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:38 compute-0 python3.9[119500]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:14:38 compute-0 sudo[119498]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v242: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:39 compute-0 sudo[119653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzhaweniffbipdsgdedkrjfwqjbbeun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796479.0188324-370-45526507382582/AnsiballZ_blockinfile.py'
Dec 03 21:14:39 compute-0 sudo[119653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:39 compute-0 python3.9[119655]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:39 compute-0 sudo[119653]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:40 compute-0 sudo[119805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cihcnumnjszovdblhbhylpatuqehfbsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796479.9334605-379-278231102909282/AnsiballZ_file.py'
Dec 03 21:14:40 compute-0 sudo[119805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:40 compute-0 python3.9[119807]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:40 compute-0 sudo[119805]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:40 compute-0 ceph-mon[75204]: pgmap v242: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:41 compute-0 sudo[119957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjcmucdyccgagkmlkhwrhahgzqdmiyzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796480.6781123-379-271442599052808/AnsiballZ_file.py'
Dec 03 21:14:41 compute-0 sudo[119957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:41 compute-0 python3.9[119959]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v243: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:41 compute-0 sudo[119957]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:41 compute-0 sudo[120109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpinargagqibjzdsofgumzikyfkuvchh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796481.4740646-394-4060174482432/AnsiballZ_mount.py'
Dec 03 21:14:41 compute-0 sudo[120109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:42 compute-0 python3.9[120111]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 03 21:14:42 compute-0 sudo[120109]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:42 compute-0 ceph-mon[75204]: pgmap v243: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:42 compute-0 sudo[120261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-japhednegukqoaxrechgbzwhjtwgsnuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796482.4195268-394-102738233465453/AnsiballZ_mount.py'
Dec 03 21:14:42 compute-0 sudo[120261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:42 compute-0 python3.9[120263]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 03 21:14:42 compute-0 sudo[120261]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v244: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:43 compute-0 sshd-session[112532]: Connection closed by 192.168.122.30 port 54120
Dec 03 21:14:43 compute-0 sshd-session[112529]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:14:43 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Dec 03 21:14:43 compute-0 systemd[1]: session-39.scope: Consumed 32.519s CPU time.
Dec 03 21:14:43 compute-0 systemd-logind[787]: Session 39 logged out. Waiting for processes to exit.
Dec 03 21:14:43 compute-0 systemd-logind[787]: Removed session 39.
Dec 03 21:14:43 compute-0 ceph-mon[75204]: pgmap v244: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v245: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:46 compute-0 ceph-mon[75204]: pgmap v245: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v246: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:48 compute-0 ceph-mon[75204]: pgmap v246: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:49 compute-0 sshd-session[120289]: Accepted publickey for zuul from 192.168.122.30 port 52412 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:14:49 compute-0 systemd-logind[787]: New session 40 of user zuul.
Dec 03 21:14:49 compute-0 systemd[1]: Started Session 40 of User zuul.
Dec 03 21:14:49 compute-0 sshd-session[120289]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:14:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v247: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:49 compute-0 sudo[120442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpxbjhkbkvgyioihkceidqkdpymhxlbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796489.176407-16-155167101282116/AnsiballZ_tempfile.py'
Dec 03 21:14:49 compute-0 sudo[120442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:49 compute-0 python3.9[120444]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 03 21:14:49 compute-0 sudo[120442]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:50 compute-0 ceph-mon[75204]: pgmap v247: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:50 compute-0 sudo[120594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olvbenfphwwebyokazmryxmlzxnybhdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796490.1554203-28-186007651378839/AnsiballZ_stat.py'
Dec 03 21:14:50 compute-0 sudo[120594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:50 compute-0 python3.9[120596]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:14:50 compute-0 sudo[120594]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v248: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:51 compute-0 sudo[120748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqxytakmbefcsyarfkardaahcwngvuva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796491.0289671-36-221121172409092/AnsiballZ_slurp.py'
Dec 03 21:14:51 compute-0 sudo[120748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:51 compute-0 python3.9[120750]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 03 21:14:51 compute-0 sudo[120748]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:14:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:14:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:14:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:14:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:14:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:14:52 compute-0 sudo[120900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xklsrdnburlxedwhuorghwgvjbxeqosu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796491.89249-44-159480534895363/AnsiballZ_stat.py'
Dec 03 21:14:52 compute-0 sudo[120900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:52 compute-0 ceph-mon[75204]: pgmap v248: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:52 compute-0 python3.9[120902]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.1obbp56s follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:14:52 compute-0 sudo[120900]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:53 compute-0 sudo[121025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjtaebhhnranqhduhiumsgogjjyglgpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796491.89249-44-159480534895363/AnsiballZ_copy.py'
Dec 03 21:14:53 compute-0 sudo[121025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v249: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:53 compute-0 python3.9[121027]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.1obbp56s mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764796491.89249-44-159480534895363/.source.1obbp56s _original_basename=.kguit_0n follow=False checksum=61f24f0023f825930bc81e128bb40f917e4e4dde backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:53 compute-0 sudo[121025]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:54 compute-0 sudo[121177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spzteatrujzwwfoboapwmcmvqnvpbbqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796493.5019062-59-257664180494008/AnsiballZ_setup.py'
Dec 03 21:14:54 compute-0 sudo[121177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:54 compute-0 ceph-mon[75204]: pgmap v249: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:54 compute-0 python3.9[121179]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:14:54 compute-0 sudo[121177]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:55 compute-0 sudo[121329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyxfeeabobmvgowntsucedxgaqmncjwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796494.6796896-68-76455156519655/AnsiballZ_blockinfile.py'
Dec 03 21:14:55 compute-0 sudo[121329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v250: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:55 compute-0 python3.9[121331]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/QcnFnE07R2H02WXa+53W3W+nwsFsC4YoQpDZUgxEwlg4f2zQf8fQIG23b5h9N8ej11I+FwfST4eb14wdXsFBAm6rVbCzkwQOmaDc1DdRfSmSFzwYKgqnejjeunc7W9ASRY8ZFAX/dexoruuzsoDFSnT/YK2DiUDLCoWmwO4mZ946GvsVF6yCywprEQo/oFdVyYbYBvGnl2hb9O06ePH8wQRx2BT7GKvzyv0j8Dz3LjXOzrd+jB7UlvodWIaHPlQhq/S/ZDfA640mfL7TSk/VRKvnWyi4m3+Gbj0A92cO36Objq1V2W1DPen5Nzv5CbZRHNjBvVR9G0jGLdsP8sWtUhe2qfiLZlAx0Cn0ZIhzPbS2Ij3lgp1Otug1NK15JYpiz9z0JO+UgfdZ9ht6yAYnsMcQ4OaFvKqWmsOxrx76BJ8s3hQuBMrZL+YgtbDswJVFn9/ay22MQ+ntCLeQL6GPb6WQJGnnWYqSlUX3e8wBllkbHrFK1/iyfqWjrHwteK8=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINtkoZCmFpb3z8TzbldoOvjALaFBxUWmFrtA4oHE040r
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAzVjP1T+0nWOYuc0KdOyqtmhcGoQseIckbkxVi0stL4dfIoBsNFyujIS49nno21BKZJb6EV/fwil4CuPgbMlGg=
                                              create=True mode=0644 path=/tmp/ansible.1obbp56s state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:55 compute-0 sudo[121329]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:55 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 03 21:14:55 compute-0 sudo[121483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xndvhhujksqixjufsqlwwzbjtsxlzarh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796495.529648-76-216610620377424/AnsiballZ_command.py'
Dec 03 21:14:55 compute-0 sudo[121483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:56 compute-0 python3.9[121485]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.1obbp56s' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:14:56 compute-0 sudo[121483]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:56 compute-0 ceph-mon[75204]: pgmap v250: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:56 compute-0 sudo[121637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsddcmqiyduwnqujrdqsmyllwdqxmkqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796496.3340662-84-165717888502731/AnsiballZ_file.py'
Dec 03 21:14:56 compute-0 sudo[121637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:14:57 compute-0 python3.9[121639]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.1obbp56s state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:14:57 compute-0 sudo[121637]: pam_unix(sudo:session): session closed for user root
Dec 03 21:14:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v251: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:57 compute-0 sshd-session[120292]: Connection closed by 192.168.122.30 port 52412
Dec 03 21:14:57 compute-0 sshd-session[120289]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:14:57 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Dec 03 21:14:57 compute-0 systemd[1]: session-40.scope: Consumed 5.610s CPU time.
Dec 03 21:14:57 compute-0 systemd-logind[787]: Session 40 logged out. Waiting for processes to exit.
Dec 03 21:14:57 compute-0 systemd-logind[787]: Removed session 40.
Dec 03 21:14:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:14:58 compute-0 ceph-mon[75204]: pgmap v251: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:14:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v252: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:00 compute-0 ceph-mon[75204]: pgmap v252: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v253: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:02 compute-0 ceph-mon[75204]: pgmap v253: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:03 compute-0 sshd-session[121664]: Accepted publickey for zuul from 192.168.122.30 port 50728 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:15:03 compute-0 systemd-logind[787]: New session 41 of user zuul.
Dec 03 21:15:03 compute-0 systemd[1]: Started Session 41 of User zuul.
Dec 03 21:15:03 compute-0 sshd-session[121664]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:15:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v254: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:04 compute-0 python3.9[121817]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:15:04 compute-0 ceph-mon[75204]: pgmap v254: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v255: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:05 compute-0 sudo[121971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ievgacxhiprshlcyhhibxwgqxyvometi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796504.6387513-32-90793187383720/AnsiballZ_systemd.py'
Dec 03 21:15:05 compute-0 sudo[121971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:05 compute-0 python3.9[121973]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 03 21:15:05 compute-0 sudo[121971]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:06 compute-0 sudo[122125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tngsiwljybquonmjowkzwsqwnpyhffah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796505.9214828-40-122773861496183/AnsiballZ_systemd.py'
Dec 03 21:15:06 compute-0 sudo[122125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:06 compute-0 ceph-mon[75204]: pgmap v255: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:06 compute-0 python3.9[122127]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:15:06 compute-0 sudo[122125]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v256: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:07 compute-0 sudo[122278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-webtbgzysfhknoxpyifeajdyduvdsavd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796506.9185824-49-103525364718166/AnsiballZ_command.py'
Dec 03 21:15:07 compute-0 sudo[122278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:07 compute-0 python3.9[122280]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:15:07 compute-0 sudo[122278]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:08 compute-0 sudo[122431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywbprptsebaekxfqqtsyinsxdxavznyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796507.824856-57-119630823158056/AnsiballZ_stat.py'
Dec 03 21:15:08 compute-0 sudo[122431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:08 compute-0 ceph-mon[75204]: pgmap v256: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:08 compute-0 python3.9[122433]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:15:08 compute-0 sudo[122431]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:09 compute-0 sudo[122583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymeduzpgdtfjdeacehkdxteefaloixxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796508.7195542-66-63980447787112/AnsiballZ_file.py'
Dec 03 21:15:09 compute-0 sudo[122583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v257: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:09 compute-0 python3.9[122585]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:09 compute-0 sudo[122583]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:09 compute-0 ceph-mon[75204]: pgmap v257: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:09 compute-0 sshd-session[121667]: Connection closed by 192.168.122.30 port 50728
Dec 03 21:15:09 compute-0 sshd-session[121664]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:15:09 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Dec 03 21:15:09 compute-0 systemd[1]: session-41.scope: Consumed 4.352s CPU time.
Dec 03 21:15:09 compute-0 systemd-logind[787]: Session 41 logged out. Waiting for processes to exit.
Dec 03 21:15:09 compute-0 systemd-logind[787]: Removed session 41.
Dec 03 21:15:10 compute-0 sshd-session[71357]: Received disconnect from 38.102.83.47 port 55182:11: disconnected by user
Dec 03 21:15:10 compute-0 sshd-session[71357]: Disconnected from user zuul 38.102.83.47 port 55182
Dec 03 21:15:10 compute-0 sshd-session[71354]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:15:10 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Dec 03 21:15:10 compute-0 systemd[1]: session-17.scope: Consumed 1min 56.543s CPU time.
Dec 03 21:15:10 compute-0 systemd-logind[787]: Session 17 logged out. Waiting for processes to exit.
Dec 03 21:15:10 compute-0 systemd-logind[787]: Removed session 17.
Dec 03 21:15:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v258: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:12 compute-0 ceph-mon[75204]: pgmap v258: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v259: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:14 compute-0 ceph-mon[75204]: pgmap v259: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v260: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:15 compute-0 sshd-session[122611]: Accepted publickey for zuul from 192.168.122.30 port 45440 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:15:15 compute-0 systemd-logind[787]: New session 42 of user zuul.
Dec 03 21:15:15 compute-0 systemd[1]: Started Session 42 of User zuul.
Dec 03 21:15:15 compute-0 sshd-session[122611]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:15:16 compute-0 ceph-mon[75204]: pgmap v260: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:16 compute-0 python3.9[122764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:15:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v261: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:17 compute-0 sudo[122918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzbpvflbbanmzogtdhmuvleobzhysrxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796517.1508768-34-186800182819615/AnsiballZ_setup.py'
Dec 03 21:15:17 compute-0 sudo[122918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:17 compute-0 python3.9[122920]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:15:17 compute-0 sudo[122918]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:18 compute-0 sudo[123002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvwnogsetuedhfpetyatouvszslkbiag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796517.1508768-34-186800182819615/AnsiballZ_dnf.py'
Dec 03 21:15:18 compute-0 sudo[123002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:18 compute-0 ceph-mon[75204]: pgmap v261: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:18 compute-0 python3.9[123004]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 03 21:15:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v262: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:19 compute-0 sudo[123002]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:20 compute-0 ceph-mon[75204]: pgmap v262: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:20 compute-0 python3.9[123155]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:15:21
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', 'images', 'volumes', '.mgr', 'vms', 'cephfs.cephfs.data']
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v263: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:15:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:15:22 compute-0 python3.9[123306]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 03 21:15:22 compute-0 ceph-mon[75204]: pgmap v263: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:23 compute-0 python3.9[123456]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:15:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v264: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:23 compute-0 python3.9[123606]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:15:24 compute-0 sshd-session[122614]: Connection closed by 192.168.122.30 port 45440
Dec 03 21:15:24 compute-0 sshd-session[122611]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:15:24 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Dec 03 21:15:24 compute-0 systemd[1]: session-42.scope: Consumed 6.358s CPU time.
Dec 03 21:15:24 compute-0 systemd-logind[787]: Session 42 logged out. Waiting for processes to exit.
Dec 03 21:15:24 compute-0 systemd-logind[787]: Removed session 42.
Dec 03 21:15:24 compute-0 ceph-mon[75204]: pgmap v264: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v265: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:26 compute-0 ceph-mon[75204]: pgmap v265: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v266: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:15:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:15:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:28 compute-0 ceph-mon[75204]: pgmap v266: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v267: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:29 compute-0 sshd-session[123631]: Accepted publickey for zuul from 192.168.122.30 port 41146 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:15:29 compute-0 systemd-logind[787]: New session 43 of user zuul.
Dec 03 21:15:29 compute-0 systemd[1]: Started Session 43 of User zuul.
Dec 03 21:15:29 compute-0 sshd-session[123631]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:15:30 compute-0 ceph-mon[75204]: pgmap v267: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:30 compute-0 python3.9[123784]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:15:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v268: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:32 compute-0 ceph-mon[75204]: pgmap v268: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:32 compute-0 sudo[123938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dstizclanubkbhkixdbxaohczpgxdbmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796532.150532-50-134454888075408/AnsiballZ_file.py'
Dec 03 21:15:32 compute-0 sudo[123938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:32 compute-0 python3.9[123940]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:32 compute-0 sudo[123938]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v269: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:33 compute-0 sudo[124090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcvrvuuuqtncwqjhfhaoctopoajadgvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796533.012537-50-22874374299639/AnsiballZ_file.py'
Dec 03 21:15:33 compute-0 sudo[124090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:33 compute-0 python3.9[124092]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:33 compute-0 sudo[124090]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:34 compute-0 sudo[124242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkzimdesavebitcytqkxaukwmcdcujyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796533.8330057-65-231674319160608/AnsiballZ_stat.py'
Dec 03 21:15:34 compute-0 sudo[124242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:34 compute-0 ceph-mon[75204]: pgmap v269: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:34 compute-0 python3.9[124244]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:34 compute-0 sudo[124242]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:35 compute-0 sudo[124365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtvbenlxzyufsjcjewxlcnxdwoupqjhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796533.8330057-65-231674319160608/AnsiballZ_copy.py'
Dec 03 21:15:35 compute-0 sudo[124365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v270: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:35 compute-0 python3.9[124367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796533.8330057-65-231674319160608/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=2c2471fe99a406a3fe04a82d1be92e23b4efda72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:35 compute-0 sudo[124365]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:35 compute-0 sudo[124445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:15:35 compute-0 sudo[124445]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:35 compute-0 sudo[124445]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:35 compute-0 sudo[124493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:15:35 compute-0 sudo[124493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:35 compute-0 sudo[124567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjuifezrdnqinmqfmxdvzryttdkuooeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796535.4772625-65-86842222149640/AnsiballZ_stat.py'
Dec 03 21:15:35 compute-0 sudo[124567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:36 compute-0 python3.9[124569]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:36 compute-0 sudo[124567]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:36 compute-0 sudo[124493]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:36 compute-0 ceph-mon[75204]: pgmap v270: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:15:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:15:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:15:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:15:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:15:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:15:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:15:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:15:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:15:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:15:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:15:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:15:36 compute-0 sudo[124696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:15:36 compute-0 sudo[124696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:36 compute-0 sudo[124696]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:36 compute-0 sudo[124753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvvtxeclajnsopgegrhzpabqcbgjxkqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796535.4772625-65-86842222149640/AnsiballZ_copy.py'
Dec 03 21:15:36 compute-0 sudo[124753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:36 compute-0 sudo[124744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:15:36 compute-0 sudo[124744]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:36 compute-0 python3.9[124772]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796535.4772625-65-86842222149640/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=d3c60da42303b433c739f24f71a523db16d56769 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:36 compute-0 sudo[124753]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:36 compute-0 podman[124812]: 2025-12-03 21:15:36.942275152 +0000 UTC m=+0.055458590 container create 91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_mahavira, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True)
Dec 03 21:15:36 compute-0 systemd[1]: Started libpod-conmon-91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39.scope.
Dec 03 21:15:37 compute-0 podman[124812]: 2025-12-03 21:15:36.91203267 +0000 UTC m=+0.025216178 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:15:37 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:15:37 compute-0 podman[124812]: 2025-12-03 21:15:37.030285694 +0000 UTC m=+0.143469212 container init 91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_mahavira, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:15:37 compute-0 podman[124812]: 2025-12-03 21:15:37.043623472 +0000 UTC m=+0.156806960 container start 91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_mahavira, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:15:37 compute-0 podman[124812]: 2025-12-03 21:15:37.04764518 +0000 UTC m=+0.160828738 container attach 91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_mahavira, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:15:37 compute-0 brave_mahavira[124852]: 167 167
Dec 03 21:15:37 compute-0 systemd[1]: libpod-91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39.scope: Deactivated successfully.
Dec 03 21:15:37 compute-0 conmon[124852]: conmon 91e7d6bb16744d80246d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39.scope/container/memory.events
Dec 03 21:15:37 compute-0 podman[124812]: 2025-12-03 21:15:37.052531171 +0000 UTC m=+0.165714649 container died 91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_mahavira, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:15:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-95563d99443bc586449cfe41b03db0356367d4cd536e3ca9e6d3cadb683f04b7-merged.mount: Deactivated successfully.
Dec 03 21:15:37 compute-0 podman[124812]: 2025-12-03 21:15:37.104747193 +0000 UTC m=+0.217930641 container remove 91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_mahavira, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:15:37 compute-0 systemd[1]: libpod-conmon-91e7d6bb16744d80246dbd2f0aaa8c1c97fb66c14386e3ba3915c86cdc0bab39.scope: Deactivated successfully.
Dec 03 21:15:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v271: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:37 compute-0 sudo[124991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqvfpnarigzxtjfjgditaibtkwgnvmjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796536.9543972-65-205376886054820/AnsiballZ_stat.py'
Dec 03 21:15:37 compute-0 sudo[124991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:37 compute-0 podman[124953]: 2025-12-03 21:15:37.336128444 +0000 UTC m=+0.077400908 container create 3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigorous_swartz, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:15:37 compute-0 systemd[1]: Started libpod-conmon-3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25.scope.
Dec 03 21:15:37 compute-0 podman[124953]: 2025-12-03 21:15:37.304897676 +0000 UTC m=+0.046170210 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:15:37 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:15:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997a270cff7cf78e8b5acec0a7321068277c39370ee088e001ea2c0e6da60240/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997a270cff7cf78e8b5acec0a7321068277c39370ee088e001ea2c0e6da60240/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997a270cff7cf78e8b5acec0a7321068277c39370ee088e001ea2c0e6da60240/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997a270cff7cf78e8b5acec0a7321068277c39370ee088e001ea2c0e6da60240/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/997a270cff7cf78e8b5acec0a7321068277c39370ee088e001ea2c0e6da60240/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:37 compute-0 podman[124953]: 2025-12-03 21:15:37.451123291 +0000 UTC m=+0.192395845 container init 3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigorous_swartz, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:15:37 compute-0 podman[124953]: 2025-12-03 21:15:37.462905997 +0000 UTC m=+0.204178491 container start 3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigorous_swartz, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:15:37 compute-0 podman[124953]: 2025-12-03 21:15:37.467108461 +0000 UTC m=+0.208381015 container attach 3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigorous_swartz, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:15:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:15:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:15:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:15:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:15:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:15:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:15:37 compute-0 python3.9[124995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:37 compute-0 sudo[124991]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:37 compute-0 sudo[125135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtwhzhpcuioxkoevsysygsroodeznfhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796536.9543972-65-205376886054820/AnsiballZ_copy.py'
Dec 03 21:15:37 compute-0 sudo[125135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:38 compute-0 vigorous_swartz[124998]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:15:38 compute-0 vigorous_swartz[124998]: --> All data devices are unavailable
Dec 03 21:15:38 compute-0 systemd[1]: libpod-3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25.scope: Deactivated successfully.
Dec 03 21:15:38 compute-0 podman[124953]: 2025-12-03 21:15:38.048085987 +0000 UTC m=+0.789358491 container died 3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigorous_swartz, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3)
Dec 03 21:15:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-997a270cff7cf78e8b5acec0a7321068277c39370ee088e001ea2c0e6da60240-merged.mount: Deactivated successfully.
Dec 03 21:15:38 compute-0 podman[124953]: 2025-12-03 21:15:38.102697173 +0000 UTC m=+0.843969627 container remove 3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigorous_swartz, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:15:38 compute-0 systemd[1]: libpod-conmon-3348b6cd8aab5da81a23dbfbeddaeafe45d245be0466ce20c35010d99b6f5e25.scope: Deactivated successfully.
Dec 03 21:15:38 compute-0 sudo[124744]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:38 compute-0 python3.9[125137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796536.9543972-65-205376886054820/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=149563241073772b4070777f145b125561375852 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:38 compute-0 sudo[125135]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:38 compute-0 sudo[125153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:15:38 compute-0 sudo[125153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:38 compute-0 sudo[125153]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:38 compute-0 sudo[125178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:15:38 compute-0 sudo[125178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:38 compute-0 ceph-mon[75204]: pgmap v271: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:38 compute-0 podman[125241]: 2025-12-03 21:15:38.538105872 +0000 UTC m=+0.042558763 container create 6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_bassi, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:15:38 compute-0 systemd[1]: Started libpod-conmon-6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b.scope.
Dec 03 21:15:38 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:15:38 compute-0 podman[125241]: 2025-12-03 21:15:38.520584872 +0000 UTC m=+0.025037783 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:15:38 compute-0 podman[125241]: 2025-12-03 21:15:38.622926889 +0000 UTC m=+0.127379800 container init 6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_bassi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:15:38 compute-0 podman[125241]: 2025-12-03 21:15:38.631719896 +0000 UTC m=+0.136172787 container start 6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:15:38 compute-0 podman[125241]: 2025-12-03 21:15:38.63561985 +0000 UTC m=+0.140072761 container attach 6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_bassi, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:15:38 compute-0 jovial_bassi[125300]: 167 167
Dec 03 21:15:38 compute-0 systemd[1]: libpod-6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b.scope: Deactivated successfully.
Dec 03 21:15:38 compute-0 podman[125241]: 2025-12-03 21:15:38.639995608 +0000 UTC m=+0.144448539 container died 6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_bassi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:15:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-a2590fce862e7648da334fc723722f2c11be61da9d529b42d05bb95821c0f85f-merged.mount: Deactivated successfully.
Dec 03 21:15:38 compute-0 podman[125241]: 2025-12-03 21:15:38.681886532 +0000 UTC m=+0.186339423 container remove 6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:15:38 compute-0 systemd[1]: libpod-conmon-6136abf1c22dc9769745260ff6a2a8d41c2f55e6df85026c9e60a1023fc7503b.scope: Deactivated successfully.
Dec 03 21:15:38 compute-0 sudo[125407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nastfkatgrpchdcjwoewjbeaxgfaxdqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796538.5168855-109-250012438000903/AnsiballZ_file.py'
Dec 03 21:15:38 compute-0 sudo[125407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:38 compute-0 podman[125400]: 2025-12-03 21:15:38.879047365 +0000 UTC m=+0.049094169 container create 04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_mayer, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:15:38 compute-0 systemd[1]: Started libpod-conmon-04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c.scope.
Dec 03 21:15:38 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f37ff4e5fcfac082b753a855aeb449ebe375960c9769a9f0d135a55f0115b6a1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f37ff4e5fcfac082b753a855aeb449ebe375960c9769a9f0d135a55f0115b6a1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:38 compute-0 podman[125400]: 2025-12-03 21:15:38.859788868 +0000 UTC m=+0.029835702 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f37ff4e5fcfac082b753a855aeb449ebe375960c9769a9f0d135a55f0115b6a1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f37ff4e5fcfac082b753a855aeb449ebe375960c9769a9f0d135a55f0115b6a1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:38 compute-0 podman[125400]: 2025-12-03 21:15:38.965878566 +0000 UTC m=+0.135925370 container init 04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_mayer, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:15:38 compute-0 podman[125400]: 2025-12-03 21:15:38.972108853 +0000 UTC m=+0.142155677 container start 04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_mayer, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:15:38 compute-0 podman[125400]: 2025-12-03 21:15:38.975783971 +0000 UTC m=+0.145830775 container attach 04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_mayer, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:15:39 compute-0 python3.9[125418]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:39 compute-0 sudo[125407]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v272: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:39 compute-0 cool_mayer[125425]: {
Dec 03 21:15:39 compute-0 cool_mayer[125425]:     "0": [
Dec 03 21:15:39 compute-0 cool_mayer[125425]:         {
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "devices": [
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "/dev/loop3"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             ],
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_name": "ceph_lv0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_size": "21470642176",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "name": "ceph_lv0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "tags": {
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cluster_name": "ceph",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.crush_device_class": "",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.encrypted": "0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.objectstore": "bluestore",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osd_id": "0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.type": "block",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.vdo": "0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.with_tpm": "0"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             },
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "type": "block",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "vg_name": "ceph_vg0"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:         }
Dec 03 21:15:39 compute-0 cool_mayer[125425]:     ],
Dec 03 21:15:39 compute-0 cool_mayer[125425]:     "1": [
Dec 03 21:15:39 compute-0 cool_mayer[125425]:         {
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "devices": [
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "/dev/loop4"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             ],
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_name": "ceph_lv1",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_size": "21470642176",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "name": "ceph_lv1",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "tags": {
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cluster_name": "ceph",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.crush_device_class": "",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.encrypted": "0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.objectstore": "bluestore",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osd_id": "1",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.type": "block",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.vdo": "0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.with_tpm": "0"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             },
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "type": "block",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "vg_name": "ceph_vg1"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:         }
Dec 03 21:15:39 compute-0 cool_mayer[125425]:     ],
Dec 03 21:15:39 compute-0 cool_mayer[125425]:     "2": [
Dec 03 21:15:39 compute-0 cool_mayer[125425]:         {
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "devices": [
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "/dev/loop5"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             ],
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_name": "ceph_lv2",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_size": "21470642176",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "name": "ceph_lv2",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "tags": {
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.cluster_name": "ceph",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.crush_device_class": "",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.encrypted": "0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.objectstore": "bluestore",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osd_id": "2",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.type": "block",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.vdo": "0",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:                 "ceph.with_tpm": "0"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             },
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "type": "block",
Dec 03 21:15:39 compute-0 cool_mayer[125425]:             "vg_name": "ceph_vg2"
Dec 03 21:15:39 compute-0 cool_mayer[125425]:         }
Dec 03 21:15:39 compute-0 cool_mayer[125425]:     ]
Dec 03 21:15:39 compute-0 cool_mayer[125425]: }
Dec 03 21:15:39 compute-0 systemd[1]: libpod-04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c.scope: Deactivated successfully.
Dec 03 21:15:39 compute-0 podman[125400]: 2025-12-03 21:15:39.369210104 +0000 UTC m=+0.539256918 container died 04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_mayer, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:15:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-f37ff4e5fcfac082b753a855aeb449ebe375960c9769a9f0d135a55f0115b6a1-merged.mount: Deactivated successfully.
Dec 03 21:15:39 compute-0 podman[125400]: 2025-12-03 21:15:39.42383247 +0000 UTC m=+0.593879284 container remove 04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_mayer, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:15:39 compute-0 systemd[1]: libpod-conmon-04aa35afadbc36990f9df35a2e210933e48acaaf709b606f95a4c8a6f1b25f1c.scope: Deactivated successfully.
Dec 03 21:15:39 compute-0 sudo[125178]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:39 compute-0 sudo[125545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:15:39 compute-0 sudo[125545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:39 compute-0 sudo[125545]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:39 compute-0 sudo[125594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:15:39 compute-0 sudo[125594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:39 compute-0 sudo[125645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amewiiedommdnxetxjdcvdqbkezqijwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796539.2992795-109-249232347770694/AnsiballZ_file.py'
Dec 03 21:15:39 compute-0 sudo[125645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:39 compute-0 python3.9[125647]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:39 compute-0 sudo[125645]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:39 compute-0 podman[125660]: 2025-12-03 21:15:39.899684714 +0000 UTC m=+0.057607107 container create d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_lamport, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:15:39 compute-0 systemd[1]: Started libpod-conmon-d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e.scope.
Dec 03 21:15:39 compute-0 podman[125660]: 2025-12-03 21:15:39.871831286 +0000 UTC m=+0.029753759 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:15:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:15:39 compute-0 podman[125660]: 2025-12-03 21:15:39.986929406 +0000 UTC m=+0.144851839 container init d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_lamport, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:15:39 compute-0 podman[125660]: 2025-12-03 21:15:39.999287799 +0000 UTC m=+0.157210192 container start d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_lamport, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:15:40 compute-0 kind_lamport[125700]: 167 167
Dec 03 21:15:40 compute-0 systemd[1]: libpod-d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e.scope: Deactivated successfully.
Dec 03 21:15:40 compute-0 conmon[125700]: conmon d8c021a5ad2b3e498311 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e.scope/container/memory.events
Dec 03 21:15:40 compute-0 podman[125660]: 2025-12-03 21:15:40.010851219 +0000 UTC m=+0.168773622 container attach d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_lamport, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:15:40 compute-0 podman[125660]: 2025-12-03 21:15:40.011380033 +0000 UTC m=+0.169302436 container died d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_lamport, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:15:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-63be4198155103ba1dc972305f67f215b2e5f88fd367bb6fdefe9dae06959d3c-merged.mount: Deactivated successfully.
Dec 03 21:15:40 compute-0 podman[125660]: 2025-12-03 21:15:40.087777824 +0000 UTC m=+0.245700207 container remove d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_lamport, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:15:40 compute-0 systemd[1]: libpod-conmon-d8c021a5ad2b3e4983117ec65f60ab8e08aa02dbbde5eea37e8afc2b5bc0bc8e.scope: Deactivated successfully.
Dec 03 21:15:40 compute-0 podman[125799]: 2025-12-03 21:15:40.303116885 +0000 UTC m=+0.046294024 container create c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_gould, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:15:40 compute-0 systemd[1]: Started libpod-conmon-c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109.scope.
Dec 03 21:15:40 compute-0 podman[125799]: 2025-12-03 21:15:40.284378062 +0000 UTC m=+0.027555211 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:15:40 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:15:40 compute-0 sudo[125868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkfjrjwyrvffimnedjfyoxkalsityzrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796540.036237-124-8675484940645/AnsiballZ_stat.py'
Dec 03 21:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4eeee5ec26d524cb3dfe64e0dd2c0a3abdaddab49540a404bef194622e228737/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4eeee5ec26d524cb3dfe64e0dd2c0a3abdaddab49540a404bef194622e228737/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4eeee5ec26d524cb3dfe64e0dd2c0a3abdaddab49540a404bef194622e228737/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4eeee5ec26d524cb3dfe64e0dd2c0a3abdaddab49540a404bef194622e228737/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:15:40 compute-0 sudo[125868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:40 compute-0 podman[125799]: 2025-12-03 21:15:40.472275366 +0000 UTC m=+0.215452545 container init c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_gould, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:15:40 compute-0 podman[125799]: 2025-12-03 21:15:40.482182992 +0000 UTC m=+0.225360141 container start c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_gould, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:15:40 compute-0 podman[125799]: 2025-12-03 21:15:40.485990805 +0000 UTC m=+0.229168014 container attach c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_gould, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, CEPH_REF=tentacle)
Dec 03 21:15:40 compute-0 ceph-mon[75204]: pgmap v272: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:40 compute-0 python3.9[125870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:40 compute-0 sudo[125868]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:40 compute-0 sudo[126036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmayoekmcqxjncvnnoejufnqpicjeqpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796540.036237-124-8675484940645/AnsiballZ_copy.py'
Dec 03 21:15:40 compute-0 sudo[126036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:41 compute-0 python3.9[126043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796540.036237-124-8675484940645/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bec64515721404c046714e3f0c8a7dec942f55c1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:41 compute-0 sudo[126036]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:41 compute-0 lvm[126070]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:15:41 compute-0 lvm[126070]: VG ceph_vg1 finished
Dec 03 21:15:41 compute-0 lvm[126069]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:15:41 compute-0 lvm[126069]: VG ceph_vg0 finished
Dec 03 21:15:41 compute-0 lvm[126073]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:15:41 compute-0 lvm[126073]: VG ceph_vg2 finished
Dec 03 21:15:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v273: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:41 compute-0 awesome_gould[125863]: {}
Dec 03 21:15:41 compute-0 systemd[1]: libpod-c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109.scope: Deactivated successfully.
Dec 03 21:15:41 compute-0 podman[125799]: 2025-12-03 21:15:41.331018379 +0000 UTC m=+1.074195558 container died c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_gould, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:15:41 compute-0 systemd[1]: libpod-c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109.scope: Consumed 1.354s CPU time.
Dec 03 21:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-4eeee5ec26d524cb3dfe64e0dd2c0a3abdaddab49540a404bef194622e228737-merged.mount: Deactivated successfully.
Dec 03 21:15:41 compute-0 podman[125799]: 2025-12-03 21:15:41.387251559 +0000 UTC m=+1.130428708 container remove c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_gould, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:15:41 compute-0 systemd[1]: libpod-conmon-c425d3efbe04513964865b44ba1df0cf5bb09cda433718d18280a3ad10ba8109.scope: Deactivated successfully.
Dec 03 21:15:41 compute-0 sudo[125594]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:15:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:15:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:15:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:15:41 compute-0 sudo[126181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:15:41 compute-0 sudo[126181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:15:41 compute-0 sudo[126181]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:41 compute-0 sudo[126262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkwxbjiozuzbpvnxajixlidpbhnfxtdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796541.3217268-124-98192508430848/AnsiballZ_stat.py'
Dec 03 21:15:41 compute-0 sudo[126262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:41 compute-0 python3.9[126264]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:41 compute-0 sudo[126262]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:42 compute-0 sudo[126385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvhdmactfmjsmidfxxcwgjaurhlvntvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796541.3217268-124-98192508430848/AnsiballZ_copy.py'
Dec 03 21:15:42 compute-0 sudo[126385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:42 compute-0 ceph-mon[75204]: pgmap v273: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:15:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:15:42 compute-0 python3.9[126387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796541.3217268-124-98192508430848/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8344db67344de20b5740c0f08184ccf2d3f2112a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:42 compute-0 sudo[126385]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:43 compute-0 sudo[126537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwqctmuxnbbcgwwwnnbihcrjgyvmuiyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796542.7574608-124-170406248105144/AnsiballZ_stat.py'
Dec 03 21:15:43 compute-0 sudo[126537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:43 compute-0 python3.9[126539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v274: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:43 compute-0 sudo[126537]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:43 compute-0 sudo[126660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxgcqbobqkzcvhldhyhqptbghxbrzpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796542.7574608-124-170406248105144/AnsiballZ_copy.py'
Dec 03 21:15:43 compute-0 sudo[126660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:43 compute-0 ceph-mon[75204]: pgmap v274: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:44 compute-0 python3.9[126662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796542.7574608-124-170406248105144/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=ee36923c9157e342b04e55ed72f2c679eadd113c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:44 compute-0 sudo[126660]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:44 compute-0 sudo[126812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aenixfhjrlhphcdkiuphobdetbyxuyht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796544.2615998-168-246703530781430/AnsiballZ_file.py'
Dec 03 21:15:44 compute-0 sudo[126812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:44 compute-0 python3.9[126814]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:44 compute-0 sudo[126812]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:45 compute-0 sudo[126964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yheyutsbyznkloufumhprjtaitaremym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796544.9307005-168-4248893897192/AnsiballZ_file.py'
Dec 03 21:15:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v275: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:45 compute-0 sudo[126964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:45 compute-0 python3.9[126966]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:45 compute-0 sudo[126964]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:46 compute-0 sudo[127116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hivcoevbphmuphxbmuvdjullmemfvdaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796545.740133-183-71369721065533/AnsiballZ_stat.py'
Dec 03 21:15:46 compute-0 sudo[127116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:46 compute-0 python3.9[127118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:46 compute-0 sudo[127116]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:46 compute-0 ceph-mon[75204]: pgmap v275: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:46 compute-0 sudo[127239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbutjtthfuvauyfskvxllzdssyallsyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796545.740133-183-71369721065533/AnsiballZ_copy.py'
Dec 03 21:15:46 compute-0 sudo[127239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:46 compute-0 python3.9[127241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796545.740133-183-71369721065533/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=a766d8357ea3590ae1b89bf19947a192ebb63bce backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:46 compute-0 sudo[127239]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v276: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:47 compute-0 sudo[127391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uerynawwmadbqzrpvqblczgctrixejqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796547.1255841-183-165062591605828/AnsiballZ_stat.py'
Dec 03 21:15:47 compute-0 sudo[127391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:47 compute-0 python3.9[127393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:47 compute-0 sudo[127391]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:48 compute-0 sudo[127514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqyaixorvuspzczlxhsqbhebfwibvmll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796547.1255841-183-165062591605828/AnsiballZ_copy.py'
Dec 03 21:15:48 compute-0 sudo[127514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:48 compute-0 python3.9[127516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796547.1255841-183-165062591605828/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8344db67344de20b5740c0f08184ccf2d3f2112a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:48 compute-0 sudo[127514]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:48 compute-0 ceph-mon[75204]: pgmap v276: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:48 compute-0 sudo[127666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrilcigmetxfxoibjytzhnqjwcztskfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796548.404328-183-234449418829706/AnsiballZ_stat.py'
Dec 03 21:15:48 compute-0 sudo[127666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:48 compute-0 python3.9[127668]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:48 compute-0 sudo[127666]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v277: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:49 compute-0 sudo[127789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxoalxwgvdimtrvfhaysivnryliiwiyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796548.404328-183-234449418829706/AnsiballZ_copy.py'
Dec 03 21:15:49 compute-0 sudo[127789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:49 compute-0 python3.9[127791]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796548.404328-183-234449418829706/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=ef7e5e8bbf0cfdcf910f97bed4ab276d0d1fbcac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:49 compute-0 sudo[127789]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:50 compute-0 ceph-mon[75204]: pgmap v277: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:50 compute-0 sudo[127941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pobctozmnnjoxjsydnygmvsjktneiiff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796550.2965224-243-178642102338470/AnsiballZ_file.py'
Dec 03 21:15:50 compute-0 sudo[127941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:50 compute-0 python3.9[127943]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:50 compute-0 sudo[127941]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v278: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:51 compute-0 sudo[128093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptmmrhgaycedmyoedmeiikginlzmvwty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796551.014292-251-179347464787912/AnsiballZ_stat.py'
Dec 03 21:15:51 compute-0 sudo[128093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:51 compute-0 python3.9[128095]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:51 compute-0 sudo[128093]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:15:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:15:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:15:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:15:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:15:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:15:51 compute-0 sudo[128216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wejvvknjhlykwntlvcgfqofyqylvyycz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796551.014292-251-179347464787912/AnsiballZ_copy.py'
Dec 03 21:15:51 compute-0 sudo[128216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:52 compute-0 python3.9[128218]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796551.014292-251-179347464787912/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b6113f6c3c3112d11c0348cd0a11619cc2e5f10c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:52 compute-0 sudo[128216]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:52 compute-0 ceph-mon[75204]: pgmap v278: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:52 compute-0 sudo[128368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqalocywujtjffcdfpffszdotpxzgtmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796552.4135358-267-245024220542289/AnsiballZ_file.py'
Dec 03 21:15:52 compute-0 sudo[128368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:52 compute-0 python3.9[128370]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:52 compute-0 sudo[128368]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v279: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:53 compute-0 sudo[128520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gikjqstcdqpdfupkeondgivzfeygbzaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796553.172751-275-110786301521603/AnsiballZ_stat.py'
Dec 03 21:15:53 compute-0 sudo[128520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:53 compute-0 python3.9[128522]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:53 compute-0 sudo[128520]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:54 compute-0 sudo[128643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzzinhkeeqhdqlebehotlulqexgadqbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796553.172751-275-110786301521603/AnsiballZ_copy.py'
Dec 03 21:15:54 compute-0 sudo[128643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:54 compute-0 python3.9[128645]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796553.172751-275-110786301521603/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b6113f6c3c3112d11c0348cd0a11619cc2e5f10c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:54 compute-0 sudo[128643]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:54 compute-0 ceph-mon[75204]: pgmap v279: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:54 compute-0 sudo[128795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wooyikumctwopptlhnsxrmzmjqegelnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796554.5874085-291-10275609238144/AnsiballZ_file.py'
Dec 03 21:15:54 compute-0 sudo[128795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:55 compute-0 python3.9[128797]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:55 compute-0 sudo[128795]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v280: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:55 compute-0 sudo[128947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifrpbqjtwhlrylxggcphiyapjthgdqdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796555.3093896-299-74204046778787/AnsiballZ_stat.py'
Dec 03 21:15:55 compute-0 sudo[128947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:55 compute-0 python3.9[128949]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:55 compute-0 sudo[128947]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:56 compute-0 sudo[129070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czznetdfzypoayzuspluzfiovnnlbrhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796555.3093896-299-74204046778787/AnsiballZ_copy.py'
Dec 03 21:15:56 compute-0 sudo[129070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:56 compute-0 python3.9[129072]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796555.3093896-299-74204046778787/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b6113f6c3c3112d11c0348cd0a11619cc2e5f10c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:56 compute-0 sudo[129070]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:56 compute-0 ceph-mon[75204]: pgmap v280: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:56 compute-0 sudo[129222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaicxmkhixfggwjdxjyzknepocsarmbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796556.500073-315-34117115039239/AnsiballZ_file.py'
Dec 03 21:15:56 compute-0 sudo[129222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:56 compute-0 python3.9[129224]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:57 compute-0 sudo[129222]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v281: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:57 compute-0 sudo[129374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udfhxljaknocpmizxjhdherfzrttihrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796557.14238-323-143196744162433/AnsiballZ_stat.py'
Dec 03 21:15:57 compute-0 sudo[129374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:57 compute-0 python3.9[129376]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:57 compute-0 sudo[129374]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:15:57 compute-0 sudo[129497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxdxjcrugfikrwxlrxfbeldtuogfldac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796557.14238-323-143196744162433/AnsiballZ_copy.py'
Dec 03 21:15:57 compute-0 sudo[129497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:58 compute-0 python3.9[129499]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796557.14238-323-143196744162433/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b6113f6c3c3112d11c0348cd0a11619cc2e5f10c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:15:58 compute-0 sudo[129497]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:58 compute-0 ceph-mon[75204]: pgmap v281: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:58 compute-0 sudo[129649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbynuaeefeztdjpymabpvmoqgrvtixxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796558.2928333-339-45388270290977/AnsiballZ_file.py'
Dec 03 21:15:58 compute-0 sudo[129649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:58 compute-0 python3.9[129651]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:15:58 compute-0 sudo[129649]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v282: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:15:59 compute-0 sudo[129801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpcrdjaozlyfesfsndcxxytkqbmatqfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796558.9222195-347-56648605168673/AnsiballZ_stat.py'
Dec 03 21:15:59 compute-0 sudo[129801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:15:59 compute-0 python3.9[129803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:15:59 compute-0 sudo[129801]: pam_unix(sudo:session): session closed for user root
Dec 03 21:15:59 compute-0 sudo[129924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfcuhcuvywmtlovokbovkeeiemztevyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796558.9222195-347-56648605168673/AnsiballZ_copy.py'
Dec 03 21:15:59 compute-0 sudo[129924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:00 compute-0 python3.9[129926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796558.9222195-347-56648605168673/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b6113f6c3c3112d11c0348cd0a11619cc2e5f10c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:00 compute-0 sudo[129924]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:00 compute-0 ceph-mon[75204]: pgmap v282: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #18. Immutable memtables: 0.
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.435135) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 18
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796560435255, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6622, "num_deletes": 251, "total_data_size": 7659381, "memory_usage": 7795256, "flush_reason": "Manual Compaction"}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #19: started
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796560485687, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 19, "file_size": 5723224, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 146, "largest_seqno": 6765, "table_properties": {"data_size": 5699882, "index_size": 15164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7109, "raw_key_size": 62775, "raw_average_key_size": 22, "raw_value_size": 5645795, "raw_average_value_size": 1993, "num_data_blocks": 680, "num_entries": 2832, "num_filter_entries": 2832, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796081, "oldest_key_time": 1764796081, "file_creation_time": 1764796560, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 19, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 50607 microseconds, and 15131 cpu microseconds.
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.485746) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #19: 5723224 bytes OK
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.485772) [db/memtable_list.cc:519] [default] Level-0 commit table #19 started
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.488358) [db/memtable_list.cc:722] [default] Level-0 commit table #19: memtable #1 done
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.488377) EVENT_LOG_v1 {"time_micros": 1764796560488372, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [3, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.488406) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[3 0 0 0 0 0 0] max score 0.75
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 7631436, prev total WAL file size 7631436, number of live WAL files 2.
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000014.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.490333) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 3@0 files to L6, score -1.00
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [19(5589KB) 13(58KB) 8(1944B)]
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796560490452, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [19, 13, 8], "score": -1, "input_data_size": 5785128, "oldest_snapshot_seqno": -1}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #20: 2658 keys, 5738140 bytes, temperature: kUnknown
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796560531962, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 20, "file_size": 5738140, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5715125, "index_size": 15290, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6661, "raw_key_size": 61215, "raw_average_key_size": 23, "raw_value_size": 5662315, "raw_average_value_size": 2130, "num_data_blocks": 686, "num_entries": 2658, "num_filter_entries": 2658, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764796560, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.532320) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 3@0 files to L6 => 5738140 bytes
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.533922) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 138.7 rd, 137.5 wr, level 6, files in(3, 0) out(1 +0 blob) MB in(5.5, 0.0 +0.0 blob) out(5.5 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 2947, records dropped: 289 output_compression: NoCompression
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.533945) EVENT_LOG_v1 {"time_micros": 1764796560533934, "job": 4, "event": "compaction_finished", "compaction_time_micros": 41717, "compaction_time_cpu_micros": 15265, "output_level": 6, "num_output_files": 1, "total_output_size": 5738140, "num_input_records": 2947, "num_output_records": 2658, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000019.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796560535865, "job": 4, "event": "table_file_deletion", "file_number": 19}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000013.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796560536127, "job": 4, "event": "table_file_deletion", "file_number": 13}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796560536400, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 03 21:16:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:16:00.490203) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:16:00 compute-0 sudo[130077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwcsybjztzhlblvmxuwqgdeqhkcnvvwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796560.363627-363-143335774256052/AnsiballZ_file.py'
Dec 03 21:16:00 compute-0 sudo[130077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:00 compute-0 python3.9[130079]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:16:00 compute-0 sudo[130077]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v283: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:01 compute-0 sudo[130229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpsllkltyppfajqfxvgtzxlmstgbgfsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796561.1422749-371-165587659249820/AnsiballZ_stat.py'
Dec 03 21:16:01 compute-0 sudo[130229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:01 compute-0 python3.9[130231]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:01 compute-0 sudo[130229]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:02 compute-0 sudo[130352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bccbzusommugosurttlcmrrjwebnatcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796561.1422749-371-165587659249820/AnsiballZ_copy.py'
Dec 03 21:16:02 compute-0 sudo[130352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:02 compute-0 python3.9[130354]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796561.1422749-371-165587659249820/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=b6113f6c3c3112d11c0348cd0a11619cc2e5f10c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:02 compute-0 sudo[130352]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:02 compute-0 ceph-mon[75204]: pgmap v283: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:02 compute-0 sshd-session[123634]: Connection closed by 192.168.122.30 port 41146
Dec 03 21:16:02 compute-0 sshd-session[123631]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:16:02 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Dec 03 21:16:02 compute-0 systemd[1]: session-43.scope: Consumed 25.634s CPU time.
Dec 03 21:16:02 compute-0 systemd-logind[787]: Session 43 logged out. Waiting for processes to exit.
Dec 03 21:16:02 compute-0 systemd-logind[787]: Removed session 43.
Dec 03 21:16:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v284: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:04 compute-0 ceph-mon[75204]: pgmap v284: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v285: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:06 compute-0 ceph-mon[75204]: pgmap v285: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v286: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:08 compute-0 sshd-session[130379]: Accepted publickey for zuul from 192.168.122.30 port 46082 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:16:08 compute-0 systemd-logind[787]: New session 44 of user zuul.
Dec 03 21:16:08 compute-0 systemd[1]: Started Session 44 of User zuul.
Dec 03 21:16:08 compute-0 sshd-session[130379]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:16:08 compute-0 ceph-mon[75204]: pgmap v286: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:08 compute-0 sudo[130532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipvldwkoyqrzhepzwghlqjrnqojhondi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796568.3644965-22-164340386389574/AnsiballZ_file.py'
Dec 03 21:16:08 compute-0 sudo[130532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:09 compute-0 python3.9[130534]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:09 compute-0 sudo[130532]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v287: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:09 compute-0 sudo[130684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykxsqenghhndusywijsleutxwxfsaqbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796569.2706401-34-48008386959101/AnsiballZ_stat.py'
Dec 03 21:16:09 compute-0 sudo[130684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:09 compute-0 python3.9[130686]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:09 compute-0 sudo[130684]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:10 compute-0 ceph-mon[75204]: pgmap v287: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:10 compute-0 sudo[130807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjhwvpcygxfzpubugsciymkfewchszcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796569.2706401-34-48008386959101/AnsiballZ_copy.py'
Dec 03 21:16:10 compute-0 sudo[130807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:10 compute-0 python3.9[130809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764796569.2706401-34-48008386959101/.source.conf _original_basename=ceph.conf follow=False checksum=61832579ecbf8b3bbfd3eb7faf9249a287d8a08d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:10 compute-0 sudo[130807]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v288: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:11 compute-0 sudo[130959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkkreobrxtpnjeyaakhvedbhmyhxosiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796570.9435086-34-269877758566084/AnsiballZ_stat.py'
Dec 03 21:16:11 compute-0 sudo[130959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:11 compute-0 python3.9[130961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:11 compute-0 sudo[130959]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:11 compute-0 sudo[131082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czbvyslormhydiruqeugudqqrfeiyvyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796570.9435086-34-269877758566084/AnsiballZ_copy.py'
Dec 03 21:16:11 compute-0 sudo[131082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:12 compute-0 python3.9[131084]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764796570.9435086-34-269877758566084/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=100907596fddba72a04e8a16770dbec161f9317a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:12 compute-0 sudo[131082]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:12 compute-0 ceph-mon[75204]: pgmap v288: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:12 compute-0 sshd-session[130382]: Connection closed by 192.168.122.30 port 46082
Dec 03 21:16:12 compute-0 sshd-session[130379]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:16:12 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Dec 03 21:16:12 compute-0 systemd[1]: session-44.scope: Consumed 3.059s CPU time.
Dec 03 21:16:12 compute-0 systemd-logind[787]: Session 44 logged out. Waiting for processes to exit.
Dec 03 21:16:12 compute-0 systemd-logind[787]: Removed session 44.
Dec 03 21:16:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v289: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:14 compute-0 ceph-mon[75204]: pgmap v289: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v290: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:16 compute-0 ceph-mon[75204]: pgmap v290: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v291: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:18 compute-0 ceph-mon[75204]: pgmap v291: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:18 compute-0 sshd-session[131111]: Accepted publickey for zuul from 192.168.122.30 port 38152 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:16:18 compute-0 systemd-logind[787]: New session 45 of user zuul.
Dec 03 21:16:18 compute-0 systemd[1]: Started Session 45 of User zuul.
Dec 03 21:16:18 compute-0 sshd-session[131111]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:16:18 compute-0 sshd-session[131109]: Connection closed by authenticating user root 103.146.202.84 port 51880 [preauth]
Dec 03 21:16:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v292: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:19 compute-0 python3.9[131264]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:16:20 compute-0 ceph-mon[75204]: pgmap v292: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:20 compute-0 sudo[131418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xebhlrrulpqozoikacbuflphqsxjfeoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796580.3031812-34-162151357138059/AnsiballZ_file.py'
Dec 03 21:16:20 compute-0 sudo[131418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:21 compute-0 python3.9[131420]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:16:21 compute-0 sudo[131418]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:16:21
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.meta', '.mgr', 'images', 'vms', 'cephfs.cephfs.data', 'volumes']
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v293: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:21 compute-0 sudo[131570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdyzhdpzhweggcmzkddefztywyyvtehs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796581.2781346-34-184129074421675/AnsiballZ_file.py'
Dec 03 21:16:21 compute-0 sudo[131570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:16:21 compute-0 python3.9[131572]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:16:21 compute-0 sudo[131570]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:16:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:16:22 compute-0 ceph-mon[75204]: pgmap v293: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:22 compute-0 python3.9[131722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:16:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v294: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:23 compute-0 sudo[131872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqrbruhoxukxxafncfwrzvacdyzbxven ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796582.9038892-57-270915637029930/AnsiballZ_seboolean.py'
Dec 03 21:16:23 compute-0 sudo[131872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:23 compute-0 python3.9[131874]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 03 21:16:24 compute-0 ceph-mon[75204]: pgmap v294: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:24 compute-0 sudo[131872]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v295: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:25 compute-0 sudo[132028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixbpeaptzfjdgwhrwgfzwbaxocibsdfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796585.2735674-67-263076771163709/AnsiballZ_setup.py'
Dec 03 21:16:25 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 03 21:16:25 compute-0 sudo[132028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:25 compute-0 python3.9[132030]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:16:26 compute-0 sudo[132028]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:26 compute-0 ceph-mon[75204]: pgmap v295: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:26 compute-0 sudo[132112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksmqbskdtzaelsjaoncnikvlxsvodgip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796585.2735674-67-263076771163709/AnsiballZ_dnf.py'
Dec 03 21:16:26 compute-0 sudo[132112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:26 compute-0 python3.9[132114]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v296: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:16:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:16:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:28 compute-0 sudo[132112]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:28 compute-0 ceph-mon[75204]: pgmap v296: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:28 compute-0 sudo[132267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlltixqnoawcwklbpsvvkmkmwpydkmsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796588.28628-79-234554893791333/AnsiballZ_systemd.py'
Dec 03 21:16:28 compute-0 sudo[132267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:29 compute-0 python3.9[132269]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:16:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v297: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:29 compute-0 sshd-session[132221]: Received disconnect from 80.94.93.233 port 46216:11:  [preauth]
Dec 03 21:16:29 compute-0 sshd-session[132221]: Disconnected from authenticating user root 80.94.93.233 port 46216 [preauth]
Dec 03 21:16:30 compute-0 sudo[132267]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:30 compute-0 ceph-mon[75204]: pgmap v297: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:31 compute-0 sudo[132422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpyzmpholqqjhfpclwjqxrkxgsbljooq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764796590.5442204-87-66103836487678/AnsiballZ_edpm_nftables_snippet.py'
Dec 03 21:16:31 compute-0 sudo[132422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:31 compute-0 python3[132424]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 03 21:16:31 compute-0 sudo[132422]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v298: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:31 compute-0 sudo[132574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heofxoteavihlwwemplmdnxatjiddecf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796591.52769-96-86831645247528/AnsiballZ_file.py'
Dec 03 21:16:31 compute-0 sudo[132574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:32 compute-0 python3.9[132576]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:32 compute-0 sudo[132574]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:32 compute-0 ceph-mon[75204]: pgmap v298: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:32 compute-0 sudo[132726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfpkcmsukzwtimmqxsqfjvpnkaleqeik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796592.2037103-104-238359330370102/AnsiballZ_stat.py'
Dec 03 21:16:32 compute-0 sudo[132726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:32 compute-0 python3.9[132728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:32 compute-0 sudo[132726]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:33 compute-0 sudo[132804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tphkbslbghkmqtufanzootunnzfnevnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796592.2037103-104-238359330370102/AnsiballZ_file.py'
Dec 03 21:16:33 compute-0 sudo[132804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v299: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:33 compute-0 python3.9[132806]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:33 compute-0 sudo[132804]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:34 compute-0 sudo[132956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brvgeiydjttdclxbzsjdfrjarszbexiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796593.6559284-116-218272815387398/AnsiballZ_stat.py'
Dec 03 21:16:34 compute-0 sudo[132956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:34 compute-0 python3.9[132958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:34 compute-0 sudo[132956]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:34 compute-0 ceph-mon[75204]: pgmap v299: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:34 compute-0 sudo[133034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulayszfjmvzbruhizjatnlbxnrnbefrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796593.6559284-116-218272815387398/AnsiballZ_file.py'
Dec 03 21:16:34 compute-0 sudo[133034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:34 compute-0 python3.9[133036]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.l67wsdtv recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:34 compute-0 sudo[133034]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v300: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:35 compute-0 sudo[133186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqucicieznfuiynreiphwrzkaxtymgwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796594.970187-128-234292490096094/AnsiballZ_stat.py'
Dec 03 21:16:35 compute-0 sudo[133186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:35 compute-0 python3.9[133188]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:35 compute-0 sudo[133186]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:35 compute-0 sudo[133264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pctqvvmiqkrgemlhjnxzbopebmiabeyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796594.970187-128-234292490096094/AnsiballZ_file.py'
Dec 03 21:16:35 compute-0 sudo[133264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:36 compute-0 python3.9[133266]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:36 compute-0 sudo[133264]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:36 compute-0 ceph-mon[75204]: pgmap v300: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:36 compute-0 sudo[133416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpotkkgssvqphctceepjxabuaivnxzna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796596.3600016-141-271481534704979/AnsiballZ_command.py'
Dec 03 21:16:36 compute-0 sudo[133416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:37 compute-0 python3.9[133418]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:16:37 compute-0 sudo[133416]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v301: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:37 compute-0 sudo[133569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fycfmejhhwvudhujihyolmqaaibzqgta ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764796597.2557623-149-54370551567284/AnsiballZ_edpm_nftables_from_files.py'
Dec 03 21:16:37 compute-0 sudo[133569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:37 compute-0 python3[133571]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 03 21:16:38 compute-0 sudo[133569]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:38 compute-0 ceph-mon[75204]: pgmap v301: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:38 compute-0 sudo[133721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnpufvzcwpcjrzpepqyswxgzrvybvrdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796598.1872478-157-200988617254568/AnsiballZ_stat.py'
Dec 03 21:16:38 compute-0 sudo[133721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:38 compute-0 python3.9[133723]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:38 compute-0 sudo[133721]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v302: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:39 compute-0 sudo[133846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bktdqvdjpqlhvgupbkmvnfefvgvhvahp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796598.1872478-157-200988617254568/AnsiballZ_copy.py'
Dec 03 21:16:39 compute-0 sudo[133846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:39 compute-0 python3.9[133848]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796598.1872478-157-200988617254568/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:39 compute-0 sudo[133846]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:40 compute-0 sudo[133998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzqlftgfmfsoxmeqwoeiaiouchqnnigv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796599.670772-172-77219131172768/AnsiballZ_stat.py'
Dec 03 21:16:40 compute-0 sudo[133998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:40 compute-0 python3.9[134000]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:40 compute-0 sudo[133998]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:40 compute-0 ceph-mon[75204]: pgmap v302: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:40 compute-0 sudo[134123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xskfjxffmzoielcqqlorwkdiueilnoof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796599.670772-172-77219131172768/AnsiballZ_copy.py'
Dec 03 21:16:40 compute-0 sudo[134123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:40 compute-0 python3.9[134125]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796599.670772-172-77219131172768/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:40 compute-0 sudo[134123]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v303: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:41 compute-0 sudo[134275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epzwpvgcweepwvwxrfgztfilvyybggrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796601.014516-187-131769648164895/AnsiballZ_stat.py'
Dec 03 21:16:41 compute-0 sudo[134275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:41 compute-0 python3.9[134277]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:41 compute-0 sudo[134278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:16:41 compute-0 sudo[134278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:41 compute-0 sudo[134278]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:41 compute-0 sudo[134275]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:41 compute-0 sudo[134305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:16:41 compute-0 sudo[134305]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:42 compute-0 sudo[134465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygifvukzymclbimtudillfvrgvidtjwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796601.014516-187-131769648164895/AnsiballZ_copy.py'
Dec 03 21:16:42 compute-0 sudo[134465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:42 compute-0 python3.9[134467]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796601.014516-187-131769648164895/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:42 compute-0 sudo[134465]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:42 compute-0 sudo[134305]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:16:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:16:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:16:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:16:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:16:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:16:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:16:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:16:42 compute-0 sudo[134509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:16:42 compute-0 sudo[134509]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:42 compute-0 sudo[134509]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:42 compute-0 sudo[134541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:16:42 compute-0 sudo[134541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:42 compute-0 ceph-mon[75204]: pgmap v303: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:16:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:16:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:42 compute-0 sudo[134710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzlilknvkuwdmsersmocomxdmuexodfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796602.4339132-202-30793087710172/AnsiballZ_stat.py'
Dec 03 21:16:42 compute-0 sudo[134710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:42 compute-0 podman[134670]: 2025-12-03 21:16:42.770082583 +0000 UTC m=+0.057863764 container create ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=great_hypatia, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:16:42 compute-0 systemd[1]: Started libpod-conmon-ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830.scope.
Dec 03 21:16:42 compute-0 podman[134670]: 2025-12-03 21:16:42.741720047 +0000 UTC m=+0.029501288 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:16:42 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:16:42 compute-0 podman[134670]: 2025-12-03 21:16:42.870136692 +0000 UTC m=+0.157917923 container init ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=great_hypatia, org.label-schema.build-date=20251030, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:16:42 compute-0 podman[134670]: 2025-12-03 21:16:42.885413449 +0000 UTC m=+0.173194640 container start ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=great_hypatia, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:16:42 compute-0 podman[134670]: 2025-12-03 21:16:42.889444036 +0000 UTC m=+0.177225227 container attach ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=great_hypatia, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:16:42 compute-0 great_hypatia[134716]: 167 167
Dec 03 21:16:42 compute-0 systemd[1]: libpod-ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830.scope: Deactivated successfully.
Dec 03 21:16:42 compute-0 podman[134670]: 2025-12-03 21:16:42.896625738 +0000 UTC m=+0.184406949 container died ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=great_hypatia, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:16:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-59bbda1054c5e6772175430c2c210aba1729cc58c86dfb6cffaad54ec419d32e-merged.mount: Deactivated successfully.
Dec 03 21:16:42 compute-0 podman[134670]: 2025-12-03 21:16:42.954141842 +0000 UTC m=+0.241923033 container remove ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=great_hypatia, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:16:42 compute-0 python3.9[134712]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:42 compute-0 systemd[1]: libpod-conmon-ac1da2f7ced58262ef0e1b90c3990d6cb004173acb76740f047bf1c6d9fdf830.scope: Deactivated successfully.
Dec 03 21:16:43 compute-0 sudo[134710]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:43 compute-0 podman[134764]: 2025-12-03 21:16:43.191995375 +0000 UTC m=+0.082460780 container create f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_elbakyan, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:16:43 compute-0 podman[134764]: 2025-12-03 21:16:43.141882489 +0000 UTC m=+0.032347984 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:16:43 compute-0 systemd[1]: Started libpod-conmon-f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278.scope.
Dec 03 21:16:43 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:16:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ead3787b3bd20dc938b0e05759d06469b89b24829645bf59aad67bd3f5bba1ea/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ead3787b3bd20dc938b0e05759d06469b89b24829645bf59aad67bd3f5bba1ea/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ead3787b3bd20dc938b0e05759d06469b89b24829645bf59aad67bd3f5bba1ea/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ead3787b3bd20dc938b0e05759d06469b89b24829645bf59aad67bd3f5bba1ea/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ead3787b3bd20dc938b0e05759d06469b89b24829645bf59aad67bd3f5bba1ea/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:43 compute-0 podman[134764]: 2025-12-03 21:16:43.307409364 +0000 UTC m=+0.197874789 container init f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_elbakyan, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:16:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v304: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:43 compute-0 podman[134764]: 2025-12-03 21:16:43.325325971 +0000 UTC m=+0.215791376 container start f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:16:43 compute-0 podman[134764]: 2025-12-03 21:16:43.330025347 +0000 UTC m=+0.220490772 container attach f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_elbakyan, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:16:43 compute-0 sudo[134883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krbydnakilwjwrocefypxsvtsggagvqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796602.4339132-202-30793087710172/AnsiballZ_copy.py'
Dec 03 21:16:43 compute-0 sudo[134883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:43 compute-0 python3.9[134885]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796602.4339132-202-30793087710172/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:43 compute-0 sudo[134883]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:43 compute-0 pedantic_elbakyan[134828]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:16:43 compute-0 pedantic_elbakyan[134828]: --> All data devices are unavailable
Dec 03 21:16:43 compute-0 systemd[1]: libpod-f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278.scope: Deactivated successfully.
Dec 03 21:16:43 compute-0 podman[134764]: 2025-12-03 21:16:43.929134075 +0000 UTC m=+0.819599480 container died f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_elbakyan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:16:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-ead3787b3bd20dc938b0e05759d06469b89b24829645bf59aad67bd3f5bba1ea-merged.mount: Deactivated successfully.
Dec 03 21:16:43 compute-0 podman[134764]: 2025-12-03 21:16:43.992823923 +0000 UTC m=+0.883289338 container remove f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_elbakyan, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:16:44 compute-0 systemd[1]: libpod-conmon-f85d87aaf4f44bac521e531a3cb341bf47252f37c273b1b3d87cb0af9c9da278.scope: Deactivated successfully.
Dec 03 21:16:44 compute-0 sudo[134541]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:44 compute-0 sudo[134988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:16:44 compute-0 sudo[134988]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:44 compute-0 sudo[134988]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:44 compute-0 sudo[135021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:16:44 compute-0 sudo[135021]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:44 compute-0 sudo[135111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhxkyrxxbfhfumzxjkzmbqclgvfaesjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796603.9000785-217-75560881586217/AnsiballZ_stat.py'
Dec 03 21:16:44 compute-0 sudo[135111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:44 compute-0 podman[135126]: 2025-12-03 21:16:44.415686291 +0000 UTC m=+0.041917659 container create fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_grothendieck, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:16:44 compute-0 systemd[1]: Started libpod-conmon-fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f.scope.
Dec 03 21:16:44 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:16:44 compute-0 podman[135126]: 2025-12-03 21:16:44.396443958 +0000 UTC m=+0.022675326 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:16:44 compute-0 podman[135126]: 2025-12-03 21:16:44.493097746 +0000 UTC m=+0.119329104 container init fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_grothendieck, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:16:44 compute-0 podman[135126]: 2025-12-03 21:16:44.50035761 +0000 UTC m=+0.126588958 container start fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_grothendieck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:16:44 compute-0 podman[135126]: 2025-12-03 21:16:44.503948796 +0000 UTC m=+0.130180214 container attach fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_grothendieck, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:16:44 compute-0 ceph-mon[75204]: pgmap v304: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:44 compute-0 focused_grothendieck[135143]: 167 167
Dec 03 21:16:44 compute-0 systemd[1]: libpod-fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f.scope: Deactivated successfully.
Dec 03 21:16:44 compute-0 conmon[135143]: conmon fa068596cf1b3e98d1bf <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f.scope/container/memory.events
Dec 03 21:16:44 compute-0 podman[135126]: 2025-12-03 21:16:44.511105416 +0000 UTC m=+0.137336814 container died fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_grothendieck, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:16:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-e2ea9fca907de7d17d2c7bfc173461d3148e460bc9cec11cb3909145316a2a12-merged.mount: Deactivated successfully.
Dec 03 21:16:44 compute-0 podman[135126]: 2025-12-03 21:16:44.555073809 +0000 UTC m=+0.181305167 container remove fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_grothendieck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:16:44 compute-0 systemd[1]: libpod-conmon-fa068596cf1b3e98d1bf48b4fc9821277a7168c7f3bb309f72afd7a240e8490f.scope: Deactivated successfully.
Dec 03 21:16:44 compute-0 python3.9[135113]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:44 compute-0 sudo[135111]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:44 compute-0 podman[135191]: 2025-12-03 21:16:44.772011614 +0000 UTC m=+0.058285355 container create d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_ride, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:16:44 compute-0 systemd[1]: Started libpod-conmon-d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22.scope.
Dec 03 21:16:44 compute-0 podman[135191]: 2025-12-03 21:16:44.74522988 +0000 UTC m=+0.031503711 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:16:44 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e3402d84e47cb21cfb44561de0c608e58c7b602df9b7129ac7e92ade1e91b3c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e3402d84e47cb21cfb44561de0c608e58c7b602df9b7129ac7e92ade1e91b3c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e3402d84e47cb21cfb44561de0c608e58c7b602df9b7129ac7e92ade1e91b3c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e3402d84e47cb21cfb44561de0c608e58c7b602df9b7129ac7e92ade1e91b3c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:44 compute-0 podman[135191]: 2025-12-03 21:16:44.866460093 +0000 UTC m=+0.152733924 container init d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_ride, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:16:44 compute-0 podman[135191]: 2025-12-03 21:16:44.879728488 +0000 UTC m=+0.166002239 container start d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_ride, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:16:44 compute-0 podman[135191]: 2025-12-03 21:16:44.883810896 +0000 UTC m=+0.170084707 container attach d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_ride, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:16:45 compute-0 sudo[135310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgsbydbdygfodthrqtqjoerwqtmwjrrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796603.9000785-217-75560881586217/AnsiballZ_copy.py'
Dec 03 21:16:45 compute-0 sudo[135310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:45 compute-0 quizzical_ride[135233]: {
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:     "0": [
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:         {
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "devices": [
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "/dev/loop3"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             ],
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_name": "ceph_lv0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_size": "21470642176",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "name": "ceph_lv0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "tags": {
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cluster_name": "ceph",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.crush_device_class": "",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.encrypted": "0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.objectstore": "bluestore",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osd_id": "0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.type": "block",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.vdo": "0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.with_tpm": "0"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             },
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "type": "block",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "vg_name": "ceph_vg0"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:         }
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:     ],
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:     "1": [
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:         {
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "devices": [
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "/dev/loop4"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             ],
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_name": "ceph_lv1",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_size": "21470642176",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "name": "ceph_lv1",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "tags": {
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cluster_name": "ceph",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.crush_device_class": "",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.encrypted": "0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.objectstore": "bluestore",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osd_id": "1",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.type": "block",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.vdo": "0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.with_tpm": "0"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             },
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "type": "block",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "vg_name": "ceph_vg1"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:         }
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:     ],
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:     "2": [
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:         {
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "devices": [
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "/dev/loop5"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             ],
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_name": "ceph_lv2",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_size": "21470642176",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "name": "ceph_lv2",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "tags": {
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.cluster_name": "ceph",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.crush_device_class": "",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.encrypted": "0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.objectstore": "bluestore",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osd_id": "2",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.type": "block",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.vdo": "0",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:                 "ceph.with_tpm": "0"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             },
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "type": "block",
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:             "vg_name": "ceph_vg2"
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:         }
Dec 03 21:16:45 compute-0 quizzical_ride[135233]:     ]
Dec 03 21:16:45 compute-0 quizzical_ride[135233]: }
Dec 03 21:16:45 compute-0 systemd[1]: libpod-d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22.scope: Deactivated successfully.
Dec 03 21:16:45 compute-0 podman[135191]: 2025-12-03 21:16:45.239371659 +0000 UTC m=+0.525645410 container died d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_ride, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle)
Dec 03 21:16:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-5e3402d84e47cb21cfb44561de0c608e58c7b602df9b7129ac7e92ade1e91b3c-merged.mount: Deactivated successfully.
Dec 03 21:16:45 compute-0 podman[135191]: 2025-12-03 21:16:45.287717299 +0000 UTC m=+0.573991040 container remove d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=quizzical_ride, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:16:45 compute-0 systemd[1]: libpod-conmon-d9d23daf99503ac0da35a6da5ab2755848dce16ea8434f5f0fb0cb42fb861d22.scope: Deactivated successfully.
Dec 03 21:16:45 compute-0 python3.9[135314]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796603.9000785-217-75560881586217/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v305: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:45 compute-0 sudo[135021]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:45 compute-0 sudo[135310]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:45 compute-0 sudo[135328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:16:45 compute-0 sudo[135328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:45 compute-0 sudo[135328]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:45 compute-0 sudo[135373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:16:45 compute-0 sudo[135373]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:45 compute-0 podman[135474]: 2025-12-03 21:16:45.736905989 +0000 UTC m=+0.042445233 container create 223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_fermi, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030)
Dec 03 21:16:45 compute-0 systemd[1]: Started libpod-conmon-223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe.scope.
Dec 03 21:16:45 compute-0 podman[135474]: 2025-12-03 21:16:45.722160916 +0000 UTC m=+0.027700190 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:16:45 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:16:45 compute-0 podman[135474]: 2025-12-03 21:16:45.887896606 +0000 UTC m=+0.193435950 container init 223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_fermi, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:16:45 compute-0 podman[135474]: 2025-12-03 21:16:45.899954447 +0000 UTC m=+0.205493711 container start 223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_fermi, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:16:45 compute-0 podman[135474]: 2025-12-03 21:16:45.903332647 +0000 UTC m=+0.208871911 container attach 223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_fermi, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030)
Dec 03 21:16:45 compute-0 beautiful_fermi[135528]: 167 167
Dec 03 21:16:45 compute-0 systemd[1]: libpod-223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe.scope: Deactivated successfully.
Dec 03 21:16:45 compute-0 podman[135474]: 2025-12-03 21:16:45.907117448 +0000 UTC m=+0.212656742 container died 223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_fermi, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle)
Dec 03 21:16:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-ae6764862be8fbaecb4bb4d1ee582e5932338384d624f418b02c7bee793361e7-merged.mount: Deactivated successfully.
Dec 03 21:16:45 compute-0 podman[135474]: 2025-12-03 21:16:45.957730198 +0000 UTC m=+0.263269482 container remove 223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=beautiful_fermi, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:16:45 compute-0 sudo[135567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaariyfsvfhzwskniqudagsjftswkxnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796605.55627-232-152865440630034/AnsiballZ_file.py'
Dec 03 21:16:45 compute-0 sudo[135567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:45 compute-0 systemd[1]: libpod-conmon-223906ed88a43097c86978011bdc6b51e67d747c3664f2ea4de10a045019cbfe.scope: Deactivated successfully.
Dec 03 21:16:46 compute-0 python3.9[135573]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:46 compute-0 podman[135581]: 2025-12-03 21:16:46.192191801 +0000 UTC m=+0.067859311 container create 8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_driscoll, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:16:46 compute-0 sudo[135567]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:46 compute-0 systemd[1]: Started libpod-conmon-8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f.scope.
Dec 03 21:16:46 compute-0 podman[135581]: 2025-12-03 21:16:46.166505296 +0000 UTC m=+0.042172896 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:16:46 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:16:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c4242e3905bdc3436151a07c22ca297b83ef7169ac4ef1e61279222871b5fc2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c4242e3905bdc3436151a07c22ca297b83ef7169ac4ef1e61279222871b5fc2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c4242e3905bdc3436151a07c22ca297b83ef7169ac4ef1e61279222871b5fc2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c4242e3905bdc3436151a07c22ca297b83ef7169ac4ef1e61279222871b5fc2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:16:46 compute-0 podman[135581]: 2025-12-03 21:16:46.278557594 +0000 UTC m=+0.154225124 container init 8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_driscoll, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:16:46 compute-0 podman[135581]: 2025-12-03 21:16:46.2870055 +0000 UTC m=+0.162673000 container start 8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_driscoll, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:16:46 compute-0 podman[135581]: 2025-12-03 21:16:46.29002637 +0000 UTC m=+0.165693880 container attach 8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_driscoll, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:16:46 compute-0 ceph-mon[75204]: pgmap v305: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:46 compute-0 sudo[135772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uifsvujdptqzwlpxezsaxxwquvezkafw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796606.3818283-240-215411397129530/AnsiballZ_command.py'
Dec 03 21:16:46 compute-0 sudo[135772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:46 compute-0 python3.9[135780]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:16:46 compute-0 sudo[135772]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:46 compute-0 lvm[135830]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:16:46 compute-0 lvm[135829]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:16:46 compute-0 lvm[135829]: VG ceph_vg0 finished
Dec 03 21:16:46 compute-0 lvm[135830]: VG ceph_vg1 finished
Dec 03 21:16:46 compute-0 lvm[135837]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:16:46 compute-0 lvm[135837]: VG ceph_vg2 finished
Dec 03 21:16:47 compute-0 busy_driscoll[135599]: {}
Dec 03 21:16:47 compute-0 systemd[1]: libpod-8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f.scope: Deactivated successfully.
Dec 03 21:16:47 compute-0 systemd[1]: libpod-8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f.scope: Consumed 1.270s CPU time.
Dec 03 21:16:47 compute-0 podman[135581]: 2025-12-03 21:16:47.125301058 +0000 UTC m=+1.000968618 container died 8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_driscoll, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:16:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-3c4242e3905bdc3436151a07c22ca297b83ef7169ac4ef1e61279222871b5fc2-merged.mount: Deactivated successfully.
Dec 03 21:16:47 compute-0 podman[135581]: 2025-12-03 21:16:47.180047577 +0000 UTC m=+1.055715097 container remove 8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_driscoll, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:16:47 compute-0 systemd[1]: libpod-conmon-8156f29cea77848083a27a8bd92a403d597f7b97519e4480006956713a5d2b4f.scope: Deactivated successfully.
Dec 03 21:16:47 compute-0 sudo[135373]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:16:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:16:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:16:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:16:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v306: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:47 compute-0 sudo[135923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:16:47 compute-0 sudo[135923]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:16:47 compute-0 sudo[135923]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:47 compute-0 sudo[136021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znlpsdollvzgsisoqvhqgqywmhhacihj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796607.1123502-248-95006859139384/AnsiballZ_blockinfile.py'
Dec 03 21:16:47 compute-0 sudo[136021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:47 compute-0 python3.9[136023]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:47 compute-0 sudo[136021]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:16:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:16:48 compute-0 ceph-mon[75204]: pgmap v306: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:48 compute-0 sudo[136173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juvuyeicsrkpannovlmuaebloxvuysqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796607.9991262-257-247800075808538/AnsiballZ_command.py'
Dec 03 21:16:48 compute-0 sudo[136173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:48 compute-0 python3.9[136175]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:16:48 compute-0 sudo[136173]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:49 compute-0 sudo[136326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmbtpuoaifrxxdpljfdrtadcstllfcmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796608.7147255-265-267519995830068/AnsiballZ_stat.py'
Dec 03 21:16:49 compute-0 sudo[136326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:49 compute-0 python3.9[136328]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:16:49 compute-0 sudo[136326]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v307: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:49 compute-0 sudo[136480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebxerawcuehalybhueafwsmhfkconwtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796609.481256-273-153101906915686/AnsiballZ_command.py'
Dec 03 21:16:49 compute-0 sudo[136480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:49 compute-0 python3.9[136482]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:16:49 compute-0 sudo[136480]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:50 compute-0 ceph-mon[75204]: pgmap v307: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:50 compute-0 sudo[136635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfvdjcrqanqmuhlwfihawmqjuzsljkff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796610.1150692-281-208198634619726/AnsiballZ_file.py'
Dec 03 21:16:50 compute-0 sudo[136635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:50 compute-0 python3.9[136637]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:50 compute-0 sudo[136635]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v308: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:16:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:16:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:16:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:16:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:16:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:16:51 compute-0 python3.9[136787]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:16:52 compute-0 ceph-mon[75204]: pgmap v308: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:52 compute-0 sudo[136938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quyjkguaskbyjamlqlmvqogydvzpzgwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796612.4412591-321-104453362885391/AnsiballZ_command.py'
Dec 03 21:16:52 compute-0 sudo[136938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:53 compute-0 python3.9[136940]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:16:53 compute-0 ovs-vsctl[136941]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 03 21:16:53 compute-0 sudo[136938]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v309: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:53 compute-0 sudo[137091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jluobunpheehipllgngvmekohbnquxbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796613.2982852-330-129253660516712/AnsiballZ_command.py'
Dec 03 21:16:53 compute-0 sudo[137091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:53 compute-0 python3.9[137093]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:16:53 compute-0 sudo[137091]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:54 compute-0 ceph-mon[75204]: pgmap v309: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:54 compute-0 sudo[137246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttnnidfijgefhmdvwsrnkcnlyneegawd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796614.0797074-338-167467222782139/AnsiballZ_command.py'
Dec 03 21:16:54 compute-0 sudo[137246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:54 compute-0 python3.9[137248]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:16:54 compute-0 ovs-vsctl[137249]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 03 21:16:54 compute-0 sudo[137246]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v310: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:55 compute-0 python3.9[137399]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:16:55 compute-0 sudo[137551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oacvkmpqwebuexhjyujujjticurdxzix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796615.6329181-355-124278447565883/AnsiballZ_file.py'
Dec 03 21:16:55 compute-0 sudo[137551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:56 compute-0 python3.9[137553]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:16:56 compute-0 sudo[137551]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:56 compute-0 ceph-mon[75204]: pgmap v310: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:56 compute-0 sudo[137703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axoptnjmrjvpzxmnoioxwchkxxjxsfbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796616.3191106-363-129943869782259/AnsiballZ_stat.py'
Dec 03 21:16:56 compute-0 sudo[137703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:56 compute-0 python3.9[137705]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:56 compute-0 sudo[137703]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:57 compute-0 sudo[137781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwuiwakmhymadafwdmldjtrawqraklbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796616.3191106-363-129943869782259/AnsiballZ_file.py'
Dec 03 21:16:57 compute-0 sudo[137781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v311: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:57 compute-0 python3.9[137783]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:16:57 compute-0 sudo[137781]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:16:58 compute-0 sudo[137933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oszntbfthgkrdulodmaimctaoqpnkvrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796617.640904-363-102276669877999/AnsiballZ_stat.py'
Dec 03 21:16:58 compute-0 sudo[137933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:58 compute-0 python3.9[137935]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:16:58 compute-0 sudo[137933]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:58 compute-0 ceph-mon[75204]: pgmap v311: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:58 compute-0 sudo[138011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdnulhpgdalauikrpdxwgsqtraehfctm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796617.640904-363-102276669877999/AnsiballZ_file.py'
Dec 03 21:16:58 compute-0 sudo[138011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:58 compute-0 python3.9[138013]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:16:58 compute-0 sudo[138011]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:59 compute-0 sudo[138163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoaoutrrznazxzqcikbitqxtjgvchfjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796618.8504074-386-16779461808735/AnsiballZ_file.py'
Dec 03 21:16:59 compute-0 sudo[138163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:16:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v312: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:16:59 compute-0 python3.9[138165]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:16:59 compute-0 sudo[138163]: pam_unix(sudo:session): session closed for user root
Dec 03 21:16:59 compute-0 sudo[138315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spfhajihpryvcabzkjkqbxeklhwcxvhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796619.6268394-394-228960048617076/AnsiballZ_stat.py'
Dec 03 21:16:59 compute-0 sudo[138315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:00 compute-0 python3.9[138317]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:00 compute-0 sudo[138315]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:00 compute-0 sudo[138393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iusjmemxmzaxlkgnnogfinargzbmoirg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796619.6268394-394-228960048617076/AnsiballZ_file.py'
Dec 03 21:17:00 compute-0 sudo[138393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:00 compute-0 ceph-mon[75204]: pgmap v312: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:00 compute-0 python3.9[138395]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:00 compute-0 sudo[138393]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:01 compute-0 sudo[138545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmvcarplfloemjmugtqxvlrmwnwzbfmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796620.8416588-406-74830519814150/AnsiballZ_stat.py'
Dec 03 21:17:01 compute-0 sudo[138545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v313: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:01 compute-0 python3.9[138547]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:01 compute-0 sudo[138545]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:01 compute-0 sudo[138623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qevbvricgckcwxbxwecpcgecexhyploj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796620.8416588-406-74830519814150/AnsiballZ_file.py'
Dec 03 21:17:01 compute-0 sudo[138623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:01 compute-0 python3.9[138625]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:01 compute-0 sudo[138623]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:02 compute-0 sudo[138775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klnsqhtbjmtvpdnvwzknygfahpiqhgio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796622.0797267-418-42464775355271/AnsiballZ_systemd.py'
Dec 03 21:17:02 compute-0 sudo[138775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:02 compute-0 ceph-mon[75204]: pgmap v313: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:02 compute-0 python3.9[138777]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:17:02 compute-0 systemd[1]: Reloading.
Dec 03 21:17:02 compute-0 systemd-rc-local-generator[138803]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:17:02 compute-0 systemd-sysv-generator[138807]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:17:02 compute-0 sudo[138775]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v314: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:03 compute-0 sudo[138964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvnvdkpgglhbakutyghnxifkptqfmlpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796623.1213279-426-229375493813181/AnsiballZ_stat.py'
Dec 03 21:17:03 compute-0 sudo[138964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:03 compute-0 python3.9[138966]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:03 compute-0 sudo[138964]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:03 compute-0 sudo[139042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqymydtfchimemuvnqbhvoymqcxpftky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796623.1213279-426-229375493813181/AnsiballZ_file.py'
Dec 03 21:17:03 compute-0 sudo[139042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:04 compute-0 python3.9[139044]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:04 compute-0 sudo[139042]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:04 compute-0 ceph-mon[75204]: pgmap v314: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:04 compute-0 sudo[139194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rutgxxmhyfrsdmimgclubjvsizbhlzkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796624.352607-438-210882809263047/AnsiballZ_stat.py'
Dec 03 21:17:04 compute-0 sudo[139194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:04 compute-0 python3.9[139196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:04 compute-0 sudo[139194]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:05 compute-0 sudo[139272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oknfnaiqmvtmwahkxqrfnuxcybefwruo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796624.352607-438-210882809263047/AnsiballZ_file.py'
Dec 03 21:17:05 compute-0 sudo[139272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v315: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:05 compute-0 python3.9[139274]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:05 compute-0 sudo[139272]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:05 compute-0 sudo[139424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doaidmnxyykgyehpzfneasrzwwjxkxjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796625.5411332-450-103315945773635/AnsiballZ_systemd.py'
Dec 03 21:17:05 compute-0 sudo[139424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:06 compute-0 python3.9[139426]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:17:06 compute-0 systemd[1]: Reloading.
Dec 03 21:17:06 compute-0 systemd-rc-local-generator[139449]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:17:06 compute-0 systemd-sysv-generator[139455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:17:06 compute-0 ceph-mon[75204]: pgmap v315: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:06 compute-0 systemd[1]: Starting Create netns directory...
Dec 03 21:17:06 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 03 21:17:06 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 03 21:17:06 compute-0 systemd[1]: Finished Create netns directory.
Dec 03 21:17:06 compute-0 sudo[139424]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:07 compute-0 sudo[139616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnjzddriuxkjbqxcvbuthpkythuepbzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796626.8241343-460-163341435659455/AnsiballZ_file.py'
Dec 03 21:17:07 compute-0 sudo[139616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v316: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:07 compute-0 python3.9[139618]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:07 compute-0 sudo[139616]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:07 compute-0 sudo[139768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrmdgvaahuzuzrxmaenjwsufrssmrvxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796627.6422122-468-85157909291054/AnsiballZ_stat.py'
Dec 03 21:17:07 compute-0 sudo[139768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:08 compute-0 python3.9[139770]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:08 compute-0 sudo[139768]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:08 compute-0 ceph-mon[75204]: pgmap v316: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:08 compute-0 sudo[139891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trchzwdxksiprkjofmbktkfbfaktlbya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796627.6422122-468-85157909291054/AnsiballZ_copy.py'
Dec 03 21:17:08 compute-0 sudo[139891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:08 compute-0 python3.9[139893]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796627.6422122-468-85157909291054/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:08 compute-0 sudo[139891]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v317: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:09 compute-0 sudo[140043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfkewwxhahocfzzulkhpzpsggyefiweg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796629.140023-485-226064020057176/AnsiballZ_file.py'
Dec 03 21:17:09 compute-0 sudo[140043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:09 compute-0 python3.9[140045]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:09 compute-0 sudo[140043]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:10 compute-0 sudo[140195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljtuhrfywhuwdfnhnkebvnlwoymrimif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796629.9966214-493-211087586940622/AnsiballZ_stat.py'
Dec 03 21:17:10 compute-0 sudo[140195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:10 compute-0 ceph-mon[75204]: pgmap v317: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:10 compute-0 python3.9[140197]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:10 compute-0 sudo[140195]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:11 compute-0 sudo[140318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-porcukjyroyjeutkienyqfjxazyhbpgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796629.9966214-493-211087586940622/AnsiballZ_copy.py'
Dec 03 21:17:11 compute-0 sudo[140318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:11 compute-0 python3.9[140320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764796629.9966214-493-211087586940622/.source.json _original_basename=.yibc8jrh follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:11 compute-0 sudo[140318]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v318: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:11 compute-0 sudo[140470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdcijzieriujyzrlhqxrnsmqqjnyvsnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796631.4416137-508-14325663722149/AnsiballZ_file.py'
Dec 03 21:17:11 compute-0 sudo[140470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:12 compute-0 python3.9[140472]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:12 compute-0 sudo[140470]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:12 compute-0 ceph-mon[75204]: pgmap v318: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #21. Immutable memtables: 0.
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.621330) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 21
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796632621376, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 791, "num_deletes": 251, "total_data_size": 718346, "memory_usage": 733872, "flush_reason": "Manual Compaction"}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #22: started
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796632629612, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 22, "file_size": 458058, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6766, "largest_seqno": 7556, "table_properties": {"data_size": 454744, "index_size": 1158, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8359, "raw_average_key_size": 19, "raw_value_size": 447741, "raw_average_value_size": 1043, "num_data_blocks": 54, "num_entries": 429, "num_filter_entries": 429, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796561, "oldest_key_time": 1764796561, "file_creation_time": 1764796632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 22, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 8352 microseconds, and 3558 cpu microseconds.
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.629681) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #22: 458058 bytes OK
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.629711) [db/memtable_list.cc:519] [default] Level-0 commit table #22 started
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.631256) [db/memtable_list.cc:722] [default] Level-0 commit table #22: memtable #1 done
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.631280) EVENT_LOG_v1 {"time_micros": 1764796632631273, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.631303) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 714382, prev total WAL file size 714382, number of live WAL files 2.
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000018.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.632065) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323532' seq:0, type:0; will stop at (end)
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [22(447KB)], [20(5603KB)]
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796632632162, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [22], "files_L6": [20], "score": -1, "input_data_size": 6196198, "oldest_snapshot_seqno": -1}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #23: 2603 keys, 4497454 bytes, temperature: kUnknown
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796632677641, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 23, "file_size": 4497454, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4478003, "index_size": 11854, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 6533, "raw_key_size": 60429, "raw_average_key_size": 23, "raw_value_size": 4429259, "raw_average_value_size": 1701, "num_data_blocks": 537, "num_entries": 2603, "num_filter_entries": 2603, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764796632, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.678030) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 4497454 bytes
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.679615) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 135.9 rd, 98.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 5.5 +0.0 blob) out(4.3 +0.0 blob), read-write-amplify(23.3) write-amplify(9.8) OK, records in: 3087, records dropped: 484 output_compression: NoCompression
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.679656) EVENT_LOG_v1 {"time_micros": 1764796632679638, "job": 6, "event": "compaction_finished", "compaction_time_micros": 45608, "compaction_time_cpu_micros": 23751, "output_level": 6, "num_output_files": 1, "total_output_size": 4497454, "num_input_records": 3087, "num_output_records": 2603, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000022.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796632679959, "job": 6, "event": "table_file_deletion", "file_number": 22}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796632681951, "job": 6, "event": "table_file_deletion", "file_number": 20}
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.631974) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.682045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.682053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.682056) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.682059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:17:12 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:17:12.682063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:17:12 compute-0 sudo[140622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nltqmmmikegmccyhdudukjkcjpdilqak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796632.3495512-516-189478526469474/AnsiballZ_stat.py'
Dec 03 21:17:12 compute-0 sudo[140622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:12 compute-0 sudo[140622]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v319: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:13 compute-0 sudo[140745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dilcmvzqulitrtqzejknozljusrihilo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796632.3495512-516-189478526469474/AnsiballZ_copy.py'
Dec 03 21:17:13 compute-0 sudo[140745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:13 compute-0 ceph-mon[75204]: pgmap v319: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:13 compute-0 sudo[140745]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:14 compute-0 sudo[140897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbxpoakxuodzcdsqxploqjwqedqdxyqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796634.15006-533-258576434165589/AnsiballZ_container_config_data.py'
Dec 03 21:17:14 compute-0 sudo[140897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v320: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:15 compute-0 python3.9[140899]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 03 21:17:15 compute-0 sudo[140897]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:16 compute-0 sudo[141049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fczqgttgylcbtxskafghafswhnvlfwkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796635.7566004-542-43267406525584/AnsiballZ_container_config_hash.py'
Dec 03 21:17:16 compute-0 sudo[141049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:16 compute-0 ceph-mon[75204]: pgmap v320: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:16 compute-0 python3.9[141051]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 03 21:17:16 compute-0 sudo[141049]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:17 compute-0 sudo[141201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xejbqcunxsuzfafdjiqtxmvcqmcisnrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796636.7601383-551-17900161332270/AnsiballZ_podman_container_info.py'
Dec 03 21:17:17 compute-0 sudo[141201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v321: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:17 compute-0 python3.9[141203]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 03 21:17:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:17 compute-0 sudo[141201]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:18 compute-0 ceph-mon[75204]: pgmap v321: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:19 compute-0 sudo[141380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txlplmergvniedqenisotpstqwxatkrb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764796638.4853735-564-259030971819149/AnsiballZ_edpm_container_manage.py'
Dec 03 21:17:19 compute-0 sudo[141380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v322: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:19 compute-0 python3[141382]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 03 21:17:20 compute-0 ceph-mon[75204]: pgmap v322: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:17:21
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['.mgr', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'backups', 'volumes', 'images', 'vms']
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v323: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:17:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:17:22 compute-0 ceph-mon[75204]: pgmap v323: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v324: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:24 compute-0 podman[141397]: 2025-12-03 21:17:24.347153619 +0000 UTC m=+4.814015001 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 03 21:17:24 compute-0 podman[141522]: 2025-12-03 21:17:24.524089919 +0000 UTC m=+0.074949760 container create eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller)
Dec 03 21:17:24 compute-0 podman[141522]: 2025-12-03 21:17:24.488080819 +0000 UTC m=+0.038940750 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 03 21:17:24 compute-0 python3[141382]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 03 21:17:24 compute-0 ceph-mon[75204]: pgmap v324: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:24 compute-0 sudo[141380]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v325: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:25 compute-0 sudo[141710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epthoobwdnvooqsrqnbgymsrmcntnuug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796644.964369-572-202178445489199/AnsiballZ_stat.py'
Dec 03 21:17:25 compute-0 sudo[141710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:25 compute-0 python3.9[141712]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:17:25 compute-0 sudo[141710]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:26 compute-0 sudo[141864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjamxtpjhdbwkbzdxfqlbsrebkrwdxcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796645.9020839-581-1248659914402/AnsiballZ_file.py'
Dec 03 21:17:26 compute-0 sudo[141864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:26 compute-0 python3.9[141866]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:26 compute-0 sudo[141864]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:26 compute-0 ceph-mon[75204]: pgmap v325: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:26 compute-0 sudo[141940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aajuntshzelectpuhsddmrfrkpodgloz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796645.9020839-581-1248659914402/AnsiballZ_stat.py'
Dec 03 21:17:26 compute-0 sudo[141940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:27 compute-0 python3.9[141942]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:17:27 compute-0 sudo[141940]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v326: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:17:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:17:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:27 compute-0 sudo[142091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imjlmcsdbrehqmbsilskoansdmonyhmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796647.1119301-581-48494299229988/AnsiballZ_copy.py'
Dec 03 21:17:27 compute-0 sudo[142091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:28 compute-0 python3.9[142093]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796647.1119301-581-48494299229988/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:17:28 compute-0 sudo[142091]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:28 compute-0 ceph-mon[75204]: pgmap v326: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:28 compute-0 sudo[142169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsamzexsobsvyiieohingaaqvtldovhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796647.1119301-581-48494299229988/AnsiballZ_systemd.py'
Dec 03 21:17:28 compute-0 sudo[142169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:28 compute-0 python3.9[142171]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:17:28 compute-0 systemd[1]: Reloading.
Dec 03 21:17:28 compute-0 systemd-rc-local-generator[142196]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:17:28 compute-0 systemd-sysv-generator[142200]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:17:29 compute-0 sudo[142169]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v327: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:29 compute-0 sudo[142281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrndscuqqfmkgglbuuokjcflsmitwlqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796647.1119301-581-48494299229988/AnsiballZ_systemd.py'
Dec 03 21:17:29 compute-0 sudo[142281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:29 compute-0 python3.9[142283]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:17:29 compute-0 systemd[1]: Reloading.
Dec 03 21:17:30 compute-0 systemd-rc-local-generator[142313]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:17:30 compute-0 systemd-sysv-generator[142317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:17:30 compute-0 systemd[1]: Starting ovn_controller container...
Dec 03 21:17:30 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:17:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0939388a2c0da322f96a12ab260619ad633970d69fb83fae8368a9d8e503a2c6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:30 compute-0 ceph-mon[75204]: pgmap v327: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b.
Dec 03 21:17:30 compute-0 podman[142325]: 2025-12-03 21:17:30.393315362 +0000 UTC m=+0.165336790 container init eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:17:30 compute-0 ovn_controller[142340]: + sudo -E kolla_set_configs
Dec 03 21:17:30 compute-0 podman[142325]: 2025-12-03 21:17:30.423067915 +0000 UTC m=+0.195089353 container start eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:17:30 compute-0 edpm-start-podman-container[142325]: ovn_controller
Dec 03 21:17:30 compute-0 systemd[1]: Created slice User Slice of UID 0.
Dec 03 21:17:30 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 03 21:17:30 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 03 21:17:30 compute-0 systemd[1]: Starting User Manager for UID 0...
Dec 03 21:17:30 compute-0 systemd[142376]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 03 21:17:30 compute-0 edpm-start-podman-container[142324]: Creating additional drop-in dependency for "ovn_controller" (eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b)
Dec 03 21:17:30 compute-0 podman[142347]: 2025-12-03 21:17:30.542332686 +0000 UTC m=+0.100139132 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:17:30 compute-0 systemd[1]: eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b-1cfe6a643379f197.service: Main process exited, code=exited, status=1/FAILURE
Dec 03 21:17:30 compute-0 systemd[1]: eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b-1cfe6a643379f197.service: Failed with result 'exit-code'.
Dec 03 21:17:30 compute-0 systemd[1]: Reloading.
Dec 03 21:17:30 compute-0 systemd-sysv-generator[142427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:17:30 compute-0 systemd-rc-local-generator[142421]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:17:30 compute-0 systemd[142376]: Queued start job for default target Main User Target.
Dec 03 21:17:30 compute-0 systemd[142376]: Created slice User Application Slice.
Dec 03 21:17:30 compute-0 systemd[142376]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 03 21:17:30 compute-0 systemd[142376]: Started Daily Cleanup of User's Temporary Directories.
Dec 03 21:17:30 compute-0 systemd[142376]: Reached target Paths.
Dec 03 21:17:30 compute-0 systemd[142376]: Reached target Timers.
Dec 03 21:17:30 compute-0 systemd[142376]: Starting D-Bus User Message Bus Socket...
Dec 03 21:17:30 compute-0 systemd[142376]: Starting Create User's Volatile Files and Directories...
Dec 03 21:17:30 compute-0 systemd[142376]: Finished Create User's Volatile Files and Directories.
Dec 03 21:17:30 compute-0 systemd[142376]: Listening on D-Bus User Message Bus Socket.
Dec 03 21:17:30 compute-0 systemd[142376]: Reached target Sockets.
Dec 03 21:17:30 compute-0 systemd[142376]: Reached target Basic System.
Dec 03 21:17:30 compute-0 systemd[142376]: Reached target Main User Target.
Dec 03 21:17:30 compute-0 systemd[142376]: Startup finished in 191ms.
Dec 03 21:17:30 compute-0 systemd[1]: Started User Manager for UID 0.
Dec 03 21:17:30 compute-0 systemd[1]: Started ovn_controller container.
Dec 03 21:17:30 compute-0 systemd[1]: Started Session c1 of User root.
Dec 03 21:17:30 compute-0 sudo[142281]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:30 compute-0 ovn_controller[142340]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 03 21:17:30 compute-0 ovn_controller[142340]: INFO:__main__:Validating config file
Dec 03 21:17:30 compute-0 ovn_controller[142340]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 03 21:17:30 compute-0 ovn_controller[142340]: INFO:__main__:Writing out command to execute
Dec 03 21:17:30 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 03 21:17:30 compute-0 ovn_controller[142340]: ++ cat /run_command
Dec 03 21:17:30 compute-0 ovn_controller[142340]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 03 21:17:30 compute-0 ovn_controller[142340]: + ARGS=
Dec 03 21:17:30 compute-0 ovn_controller[142340]: + sudo kolla_copy_cacerts
Dec 03 21:17:31 compute-0 systemd[1]: Started Session c2 of User root.
Dec 03 21:17:31 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 03 21:17:31 compute-0 ovn_controller[142340]: + [[ ! -n '' ]]
Dec 03 21:17:31 compute-0 ovn_controller[142340]: + . kolla_extend_start
Dec 03 21:17:31 compute-0 ovn_controller[142340]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 03 21:17:31 compute-0 ovn_controller[142340]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 03 21:17:31 compute-0 ovn_controller[142340]: + umask 0022
Dec 03 21:17:31 compute-0 ovn_controller[142340]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.0960] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.0967] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.0978] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.0984] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.0987] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 03 21:17:31 compute-0 kernel: br-int: entered promiscuous mode
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00019|main|INFO|OVS feature set changed, force recompute.
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 03 21:17:31 compute-0 ovn_controller[142340]: 2025-12-03T21:17:31Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.1248] manager: (ovn-5d60bc-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 03 21:17:31 compute-0 systemd-udevd[142471]: Network interface NamePolicy= disabled on kernel command line.
Dec 03 21:17:31 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.1574] device (genev_sys_6081): carrier: link connected
Dec 03 21:17:31 compute-0 NetworkManager[48996]: <info>  [1764796651.1579] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Dec 03 21:17:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v328: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:31 compute-0 sudo[142600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igqfknfhmqvejhzcydbqbxqepczgrwnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796651.1894705-609-212491529938929/AnsiballZ_command.py'
Dec 03 21:17:31 compute-0 sudo[142600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:31 compute-0 python3.9[142602]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:17:31 compute-0 ovs-vsctl[142603]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 03 21:17:31 compute-0 sudo[142600]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:32 compute-0 ceph-mon[75204]: pgmap v328: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:32 compute-0 sudo[142753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyghoawkzejsnimjvgjmbullswjycpws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796652.0987213-617-216253683205171/AnsiballZ_command.py'
Dec 03 21:17:32 compute-0 sudo[142753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:32 compute-0 python3.9[142755]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:17:32 compute-0 ovs-vsctl[142757]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 03 21:17:32 compute-0 sudo[142753]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v329: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:33 compute-0 sudo[142908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfwkbxmvsdldrkbsyhssmvmetcjmnvte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796653.2434711-631-80876186751942/AnsiballZ_command.py'
Dec 03 21:17:33 compute-0 sudo[142908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:33 compute-0 python3.9[142910]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:17:33 compute-0 ovs-vsctl[142911]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 03 21:17:33 compute-0 sudo[142908]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:34 compute-0 sshd-session[131114]: Connection closed by 192.168.122.30 port 38152
Dec 03 21:17:34 compute-0 sshd-session[131111]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:17:34 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Dec 03 21:17:34 compute-0 systemd[1]: session-45.scope: Consumed 1min 4.123s CPU time.
Dec 03 21:17:34 compute-0 systemd-logind[787]: Session 45 logged out. Waiting for processes to exit.
Dec 03 21:17:34 compute-0 systemd-logind[787]: Removed session 45.
Dec 03 21:17:34 compute-0 ceph-mon[75204]: pgmap v329: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v330: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:36 compute-0 ceph-mon[75204]: pgmap v330: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v331: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:38 compute-0 ceph-mon[75204]: pgmap v331: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v332: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:39 compute-0 sshd-session[142936]: Accepted publickey for zuul from 192.168.122.30 port 50544 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:17:39 compute-0 systemd-logind[787]: New session 47 of user zuul.
Dec 03 21:17:39 compute-0 systemd[1]: Started Session 47 of User zuul.
Dec 03 21:17:39 compute-0 sshd-session[142936]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:17:40 compute-0 ceph-mon[75204]: pgmap v332: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:40 compute-0 python3.9[143089]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:17:41 compute-0 systemd[1]: Stopping User Manager for UID 0...
Dec 03 21:17:41 compute-0 systemd[142376]: Activating special unit Exit the Session...
Dec 03 21:17:41 compute-0 systemd[142376]: Stopped target Main User Target.
Dec 03 21:17:41 compute-0 systemd[142376]: Stopped target Basic System.
Dec 03 21:17:41 compute-0 systemd[142376]: Stopped target Paths.
Dec 03 21:17:41 compute-0 systemd[142376]: Stopped target Sockets.
Dec 03 21:17:41 compute-0 systemd[142376]: Stopped target Timers.
Dec 03 21:17:41 compute-0 systemd[142376]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 03 21:17:41 compute-0 systemd[142376]: Closed D-Bus User Message Bus Socket.
Dec 03 21:17:41 compute-0 systemd[142376]: Stopped Create User's Volatile Files and Directories.
Dec 03 21:17:41 compute-0 systemd[142376]: Removed slice User Application Slice.
Dec 03 21:17:41 compute-0 systemd[142376]: Reached target Shutdown.
Dec 03 21:17:41 compute-0 systemd[142376]: Finished Exit the Session.
Dec 03 21:17:41 compute-0 systemd[142376]: Reached target Exit the Session.
Dec 03 21:17:41 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Dec 03 21:17:41 compute-0 systemd[1]: Stopped User Manager for UID 0.
Dec 03 21:17:41 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 03 21:17:41 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 03 21:17:41 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 03 21:17:41 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 03 21:17:41 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Dec 03 21:17:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v333: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:41 compute-0 sudo[143244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtwpbfogwoivzwtanssftklhkbpoloop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796661.0522273-34-203616828639143/AnsiballZ_file.py'
Dec 03 21:17:41 compute-0 sudo[143244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:41 compute-0 python3.9[143246]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:41 compute-0 sudo[143244]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:42 compute-0 ceph-mon[75204]: pgmap v333: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:42 compute-0 sudo[143396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eujiwalfnvwewtaxljdrjyxtkgauqgzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796662.078564-34-142647196375468/AnsiballZ_file.py'
Dec 03 21:17:42 compute-0 sudo[143396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:42 compute-0 python3.9[143398]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:42 compute-0 sudo[143396]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v334: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:43 compute-0 sudo[143548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rskghyhxoqksohrajhvofuvbobbfrtlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796662.9361448-34-46300629490587/AnsiballZ_file.py'
Dec 03 21:17:43 compute-0 sudo[143548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:43 compute-0 python3.9[143550]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:43 compute-0 sudo[143548]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:44 compute-0 sudo[143700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mscruvogokluawenyyvnzacpflyeqlwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796663.8211184-34-72594474690449/AnsiballZ_file.py'
Dec 03 21:17:44 compute-0 sudo[143700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:44 compute-0 python3.9[143702]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:44 compute-0 sudo[143700]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:44 compute-0 ceph-mon[75204]: pgmap v334: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:44 compute-0 sudo[143852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fecfrrswccvukrjlcviojjaucdawbigp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796664.5457804-34-272788245097489/AnsiballZ_file.py'
Dec 03 21:17:44 compute-0 sudo[143852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:45 compute-0 python3.9[143854]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:45 compute-0 sudo[143852]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v335: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:46 compute-0 python3.9[144004]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:17:46 compute-0 ceph-mon[75204]: pgmap v335: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:46 compute-0 sudo[144154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyemttilzqwjmtcqxuhrwequtupmlvds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796666.2440693-78-56863242216152/AnsiballZ_seboolean.py'
Dec 03 21:17:46 compute-0 sudo[144154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:46 compute-0 python3.9[144156]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 03 21:17:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v336: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:47 compute-0 sudo[144157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:17:47 compute-0 sudo[144157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:47 compute-0 sudo[144157]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:47 compute-0 sudo[144182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:17:47 compute-0 sudo[144182]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:47 compute-0 sudo[144154]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:48 compute-0 sudo[144182]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:17:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:17:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:17:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:17:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:17:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:17:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:17:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:17:48 compute-0 sudo[144355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:17:48 compute-0 sudo[144355]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:48 compute-0 sudo[144355]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:48 compute-0 sudo[144410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:17:48 compute-0 sudo[144410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:48 compute-0 ceph-mon[75204]: pgmap v336: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:17:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:17:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:17:48 compute-0 python3.9[144413]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:48 compute-0 podman[144489]: 2025-12-03 21:17:48.801522837 +0000 UTC m=+0.067278070 container create 8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_albattani, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default)
Dec 03 21:17:48 compute-0 systemd[1]: Started libpod-conmon-8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325.scope.
Dec 03 21:17:48 compute-0 podman[144489]: 2025-12-03 21:17:48.777346864 +0000 UTC m=+0.043102177 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:17:48 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:17:48 compute-0 podman[144489]: 2025-12-03 21:17:48.895996158 +0000 UTC m=+0.161751421 container init 8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_albattani, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:17:48 compute-0 podman[144489]: 2025-12-03 21:17:48.902943512 +0000 UTC m=+0.168698775 container start 8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_albattani, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:17:48 compute-0 podman[144489]: 2025-12-03 21:17:48.907948096 +0000 UTC m=+0.173703369 container attach 8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_albattani, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:17:48 compute-0 pensive_albattani[144513]: 167 167
Dec 03 21:17:48 compute-0 systemd[1]: libpod-8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325.scope: Deactivated successfully.
Dec 03 21:17:48 compute-0 podman[144489]: 2025-12-03 21:17:48.909818596 +0000 UTC m=+0.175573909 container died 8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_albattani, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:17:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-fef0a4f41563aa276d01f1e778752122b9c081eef67080297386070307aea73f-merged.mount: Deactivated successfully.
Dec 03 21:17:48 compute-0 podman[144489]: 2025-12-03 21:17:48.967762995 +0000 UTC m=+0.233518228 container remove 8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_albattani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle)
Dec 03 21:17:48 compute-0 systemd[1]: libpod-conmon-8f8a02e3e582f537a2564613bd5c1687d28e3735f2c59f79356fa771275f2325.scope: Deactivated successfully.
Dec 03 21:17:49 compute-0 podman[144594]: 2025-12-03 21:17:49.157059978 +0000 UTC m=+0.047951786 container create b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_cohen, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:17:49 compute-0 systemd[1]: Started libpod-conmon-b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a.scope.
Dec 03 21:17:49 compute-0 podman[144594]: 2025-12-03 21:17:49.134983941 +0000 UTC m=+0.025875789 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:17:49 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:17:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714e1a4b190c59c0ff1a61e5cfce80e88c2588ab707c717df3e533ed7c7e530b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714e1a4b190c59c0ff1a61e5cfce80e88c2588ab707c717df3e533ed7c7e530b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714e1a4b190c59c0ff1a61e5cfce80e88c2588ab707c717df3e533ed7c7e530b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714e1a4b190c59c0ff1a61e5cfce80e88c2588ab707c717df3e533ed7c7e530b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/714e1a4b190c59c0ff1a61e5cfce80e88c2588ab707c717df3e533ed7c7e530b/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:49 compute-0 podman[144594]: 2025-12-03 21:17:49.245105538 +0000 UTC m=+0.135997356 container init b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_cohen, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:17:49 compute-0 podman[144594]: 2025-12-03 21:17:49.254476757 +0000 UTC m=+0.145368545 container start b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_cohen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:17:49 compute-0 podman[144594]: 2025-12-03 21:17:49.257836347 +0000 UTC m=+0.148728165 container attach b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_cohen, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:17:49 compute-0 python3.9[144623]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796667.8076274-86-262003181870701/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v337: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:49 compute-0 gallant_cohen[144631]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:17:49 compute-0 gallant_cohen[144631]: --> All data devices are unavailable
Dec 03 21:17:49 compute-0 systemd[1]: libpod-b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a.scope: Deactivated successfully.
Dec 03 21:17:49 compute-0 podman[144594]: 2025-12-03 21:17:49.828717992 +0000 UTC m=+0.719609810 container died b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_cohen, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:17:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-714e1a4b190c59c0ff1a61e5cfce80e88c2588ab707c717df3e533ed7c7e530b-merged.mount: Deactivated successfully.
Dec 03 21:17:49 compute-0 podman[144594]: 2025-12-03 21:17:49.881164175 +0000 UTC m=+0.772055993 container remove b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_cohen, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:17:49 compute-0 systemd[1]: libpod-conmon-b051a431daf0e05e17425c2f7c7291b76f27a27baab9cff7daf82dad39c15e9a.scope: Deactivated successfully.
Dec 03 21:17:49 compute-0 sudo[144410]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:49 compute-0 sudo[144812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:17:49 compute-0 python3.9[144800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:49 compute-0 sudo[144812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:50 compute-0 sudo[144812]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:50 compute-0 sudo[144837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:17:50 compute-0 sudo[144837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:50 compute-0 podman[144944]: 2025-12-03 21:17:50.358755901 +0000 UTC m=+0.034145438 container create 3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_mirzakhani, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:17:50 compute-0 systemd[1]: Started libpod-conmon-3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef.scope.
Dec 03 21:17:50 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:17:50 compute-0 podman[144944]: 2025-12-03 21:17:50.343920806 +0000 UTC m=+0.019310363 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:17:50 compute-0 podman[144944]: 2025-12-03 21:17:50.449215576 +0000 UTC m=+0.124605183 container init 3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 03 21:17:50 compute-0 podman[144944]: 2025-12-03 21:17:50.46143575 +0000 UTC m=+0.136825287 container start 3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_mirzakhani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:17:50 compute-0 focused_mirzakhani[145000]: 167 167
Dec 03 21:17:50 compute-0 systemd[1]: libpod-3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef.scope: Deactivated successfully.
Dec 03 21:17:50 compute-0 podman[144944]: 2025-12-03 21:17:50.466847064 +0000 UTC m=+0.142236621 container attach 3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_mirzakhani, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:17:50 compute-0 podman[144944]: 2025-12-03 21:17:50.468278253 +0000 UTC m=+0.143667810 container died 3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_mirzakhani, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:17:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-4de3a05f17e46a6019de05efe5e5a266cd35852ea5373c116eedc845c7d58535-merged.mount: Deactivated successfully.
Dec 03 21:17:50 compute-0 podman[144944]: 2025-12-03 21:17:50.504607608 +0000 UTC m=+0.179997135 container remove 3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_mirzakhani, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:17:50 compute-0 systemd[1]: libpod-conmon-3fcf985f72f222a3906ab521b0a15dff305e9c131fd1b8bfa25dc16796cf4bef.scope: Deactivated successfully.
Dec 03 21:17:50 compute-0 ceph-mon[75204]: pgmap v337: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:50 compute-0 python3.9[145013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796669.5445263-101-62394651386676/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:50 compute-0 podman[145036]: 2025-12-03 21:17:50.71835825 +0000 UTC m=+0.068353878 container create 4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bell, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:17:50 compute-0 systemd[1]: Started libpod-conmon-4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676.scope.
Dec 03 21:17:50 compute-0 podman[145036]: 2025-12-03 21:17:50.693390146 +0000 UTC m=+0.043385814 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:17:50 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:17:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4da9b94c860a9d8d8d135f9f5b7ef7e7cb1351f6c09ab93213c7f9313477770d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4da9b94c860a9d8d8d135f9f5b7ef7e7cb1351f6c09ab93213c7f9313477770d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4da9b94c860a9d8d8d135f9f5b7ef7e7cb1351f6c09ab93213c7f9313477770d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4da9b94c860a9d8d8d135f9f5b7ef7e7cb1351f6c09ab93213c7f9313477770d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:50 compute-0 podman[145036]: 2025-12-03 21:17:50.810187511 +0000 UTC m=+0.160183169 container init 4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bell, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:17:50 compute-0 podman[145036]: 2025-12-03 21:17:50.822676893 +0000 UTC m=+0.172672551 container start 4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bell, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 03 21:17:50 compute-0 podman[145036]: 2025-12-03 21:17:50.828178569 +0000 UTC m=+0.178174197 container attach 4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bell, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:17:51 compute-0 dazzling_bell[145076]: {
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:     "0": [
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:         {
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "devices": [
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "/dev/loop3"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             ],
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_name": "ceph_lv0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_size": "21470642176",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "name": "ceph_lv0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "tags": {
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cluster_name": "ceph",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.crush_device_class": "",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.encrypted": "0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.objectstore": "bluestore",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osd_id": "0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.type": "block",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.vdo": "0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.with_tpm": "0"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             },
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "type": "block",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "vg_name": "ceph_vg0"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:         }
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:     ],
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:     "1": [
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:         {
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "devices": [
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "/dev/loop4"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             ],
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_name": "ceph_lv1",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_size": "21470642176",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "name": "ceph_lv1",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "tags": {
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cluster_name": "ceph",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.crush_device_class": "",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.encrypted": "0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.objectstore": "bluestore",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osd_id": "1",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.type": "block",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.vdo": "0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.with_tpm": "0"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             },
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "type": "block",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "vg_name": "ceph_vg1"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:         }
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:     ],
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:     "2": [
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:         {
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "devices": [
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "/dev/loop5"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             ],
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_name": "ceph_lv2",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_size": "21470642176",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "name": "ceph_lv2",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "tags": {
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.cluster_name": "ceph",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.crush_device_class": "",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.encrypted": "0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.objectstore": "bluestore",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osd_id": "2",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.type": "block",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.vdo": "0",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:                 "ceph.with_tpm": "0"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             },
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "type": "block",
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:             "vg_name": "ceph_vg2"
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:         }
Dec 03 21:17:51 compute-0 dazzling_bell[145076]:     ]
Dec 03 21:17:51 compute-0 dazzling_bell[145076]: }
Dec 03 21:17:51 compute-0 systemd[1]: libpod-4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676.scope: Deactivated successfully.
Dec 03 21:17:51 compute-0 podman[145036]: 2025-12-03 21:17:51.180129275 +0000 UTC m=+0.530124923 container died 4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bell, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle)
Dec 03 21:17:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-4da9b94c860a9d8d8d135f9f5b7ef7e7cb1351f6c09ab93213c7f9313477770d-merged.mount: Deactivated successfully.
Dec 03 21:17:51 compute-0 podman[145036]: 2025-12-03 21:17:51.261853637 +0000 UTC m=+0.611849295 container remove 4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_bell, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:17:51 compute-0 systemd[1]: libpod-conmon-4e4ac3c03ab60fe859ff8aef3e8419d5674db81474fc7e7ec6a8be62e7f70676.scope: Deactivated successfully.
Dec 03 21:17:51 compute-0 sudo[144837]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:51 compute-0 sudo[145221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nomitirxtdnbwjuegwmomcrfhpjjwkmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796670.9405851-118-230305352470476/AnsiballZ_setup.py'
Dec 03 21:17:51 compute-0 sudo[145221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v338: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:51 compute-0 sudo[145224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:17:51 compute-0 sudo[145224]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:51 compute-0 sudo[145224]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:51 compute-0 sudo[145249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:17:51 compute-0 sudo[145249]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:51 compute-0 python3.9[145223]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:17:51 compute-0 podman[145295]: 2025-12-03 21:17:51.765864015 +0000 UTC m=+0.070971497 container create fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_joliot, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:17:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:17:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:17:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:17:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:17:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:17:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:17:51 compute-0 sudo[145221]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:51 compute-0 systemd[1]: Started libpod-conmon-fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f.scope.
Dec 03 21:17:51 compute-0 podman[145295]: 2025-12-03 21:17:51.735458737 +0000 UTC m=+0.040566259 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:17:51 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:17:51 compute-0 podman[145295]: 2025-12-03 21:17:51.859597827 +0000 UTC m=+0.164705289 container init fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_joliot, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:17:51 compute-0 podman[145295]: 2025-12-03 21:17:51.867315882 +0000 UTC m=+0.172423324 container start fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_joliot, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:17:51 compute-0 podman[145295]: 2025-12-03 21:17:51.871191755 +0000 UTC m=+0.176299197 container attach fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_joliot, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 03 21:17:51 compute-0 sweet_joliot[145312]: 167 167
Dec 03 21:17:51 compute-0 systemd[1]: libpod-fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f.scope: Deactivated successfully.
Dec 03 21:17:51 compute-0 podman[145295]: 2025-12-03 21:17:51.874595736 +0000 UTC m=+0.179703188 container died fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_joliot, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:17:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-6e626d4e0be2861fe82b40b15d95e36712c9d6b1840e793d499f91f805de1b79-merged.mount: Deactivated successfully.
Dec 03 21:17:51 compute-0 podman[145295]: 2025-12-03 21:17:51.918074641 +0000 UTC m=+0.223182093 container remove fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_joliot, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:17:51 compute-0 systemd[1]: libpod-conmon-fd99bd03a3f3d1b162289898e3a64ee5bc076fc7cd9b8a6ca5555261401fdf7f.scope: Deactivated successfully.
Dec 03 21:17:52 compute-0 podman[145336]: 2025-12-03 21:17:52.15881053 +0000 UTC m=+0.073291509 container create 7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_shamir, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:17:52 compute-0 podman[145336]: 2025-12-03 21:17:52.12831886 +0000 UTC m=+0.042799899 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:17:52 compute-0 systemd[1]: Started libpod-conmon-7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366.scope.
Dec 03 21:17:52 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:17:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd0917d8fa3f13220b2276e3d08590714fb8a2802e39d2769e7fbe30e5dfc076/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd0917d8fa3f13220b2276e3d08590714fb8a2802e39d2769e7fbe30e5dfc076/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd0917d8fa3f13220b2276e3d08590714fb8a2802e39d2769e7fbe30e5dfc076/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd0917d8fa3f13220b2276e3d08590714fb8a2802e39d2769e7fbe30e5dfc076/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:17:52 compute-0 podman[145336]: 2025-12-03 21:17:52.310988906 +0000 UTC m=+0.225469855 container init 7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_shamir, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:17:52 compute-0 podman[145336]: 2025-12-03 21:17:52.322883992 +0000 UTC m=+0.237364981 container start 7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_shamir, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:17:52 compute-0 podman[145336]: 2025-12-03 21:17:52.326988121 +0000 UTC m=+0.241469070 container attach 7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_shamir, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:17:52 compute-0 sudo[145431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtcqnkamwelllvxmlswjsjsijzuipuge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796670.9405851-118-230305352470476/AnsiballZ_dnf.py'
Dec 03 21:17:52 compute-0 sudo[145431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:52 compute-0 ceph-mon[75204]: pgmap v338: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:52 compute-0 python3.9[145433]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:17:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:52 compute-0 lvm[145509]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:17:52 compute-0 lvm[145509]: VG ceph_vg0 finished
Dec 03 21:17:52 compute-0 lvm[145510]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:17:52 compute-0 lvm[145510]: VG ceph_vg1 finished
Dec 03 21:17:52 compute-0 lvm[145511]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:17:52 compute-0 lvm[145511]: VG ceph_vg2 finished
Dec 03 21:17:53 compute-0 exciting_shamir[145400]: {}
Dec 03 21:17:53 compute-0 systemd[1]: libpod-7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366.scope: Deactivated successfully.
Dec 03 21:17:53 compute-0 podman[145336]: 2025-12-03 21:17:53.110609161 +0000 UTC m=+1.025090100 container died 7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_shamir, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3)
Dec 03 21:17:53 compute-0 systemd[1]: libpod-7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366.scope: Consumed 1.365s CPU time.
Dec 03 21:17:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-fd0917d8fa3f13220b2276e3d08590714fb8a2802e39d2769e7fbe30e5dfc076-merged.mount: Deactivated successfully.
Dec 03 21:17:53 compute-0 podman[145336]: 2025-12-03 21:17:53.151851198 +0000 UTC m=+1.066332137 container remove 7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_shamir, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:17:53 compute-0 systemd[1]: libpod-conmon-7c8b4c2fc2dc2bee15f856c2aac4705f80193bb6764b04673810141865d95366.scope: Deactivated successfully.
Dec 03 21:17:53 compute-0 sudo[145249]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:17:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:17:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:17:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:17:53 compute-0 sudo[145526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:17:53 compute-0 sudo[145526]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:17:53 compute-0 sudo[145526]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v339: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:17:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:17:54 compute-0 ceph-mon[75204]: pgmap v339: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:54 compute-0 sudo[145431]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v340: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:55 compute-0 sudo[145700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evgzftkoixvtxuyityjcuqqzkzewrsvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796674.7368643-130-9357648195172/AnsiballZ_systemd.py'
Dec 03 21:17:55 compute-0 sudo[145700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:17:55 compute-0 python3.9[145702]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:17:55 compute-0 sudo[145700]: pam_unix(sudo:session): session closed for user root
Dec 03 21:17:56 compute-0 ceph-mon[75204]: pgmap v340: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:56 compute-0 python3.9[145855]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:57 compute-0 python3.9[145976]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796676.0173767-138-50792853949407/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v341: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:17:57 compute-0 python3.9[146126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:17:58 compute-0 ceph-mon[75204]: pgmap v341: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:58 compute-0 python3.9[146247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796677.3642173-138-32180358981153/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:17:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v342: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:59 compute-0 ceph-mon[75204]: pgmap v342: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:17:59 compute-0 python3.9[146397]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:00 compute-0 python3.9[146518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796679.2960143-182-202471294626948/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:00 compute-0 ovn_controller[142340]: 2025-12-03T21:18:00Z|00025|memory|INFO|16256 kB peak resident set size after 29.7 seconds
Dec 03 21:18:00 compute-0 ovn_controller[142340]: 2025-12-03T21:18:00Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec 03 21:18:00 compute-0 podman[146519]: 2025-12-03 21:18:00.735058256 +0000 UTC m=+0.127514671 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 21:18:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v343: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:01 compute-0 python3.9[146695]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:01 compute-0 anacron[34192]: Job `cron.daily' started
Dec 03 21:18:01 compute-0 anacron[34192]: Job `cron.daily' terminated
Dec 03 21:18:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:18:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 1811 writes, 7785 keys, 1811 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.01 MB/s
                                           Cumulative WAL: 1811 writes, 1811 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1811 writes, 7785 keys, 1811 commit groups, 1.0 writes per commit group, ingest: 8.44 MB, 0.01 MB/s
                                           Interval WAL: 1811 writes, 1811 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     92.4      0.06              0.02         3    0.021       0      0       0.0       0.0
                                             L6      1/0    4.29 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.6    130.8    111.8      0.09              0.04         2    0.044    6034    773       0.0       0.0
                                            Sum      1/0    4.29 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.6     75.3    103.5      0.15              0.06         5    0.030    6034    773       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.7     78.1    107.0      0.15              0.06         4    0.037    6034    773       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    130.8    111.8      0.09              0.04         2    0.044    6034    773       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    100.0      0.06              0.02         2    0.029       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     10.4      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.006, interval 0.006
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.02 GB write, 0.03 MB/s write, 0.01 GB read, 0.02 MB/s read, 0.2 seconds
                                           Interval compaction: 0.02 GB write, 0.03 MB/s write, 0.01 GB read, 0.02 MB/s read, 0.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56170d6e38d0#2 capacity: 308.00 MB usage: 510.84 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(37,440.69 KB,0.139727%) FilterBlock(6,24.23 KB,0.00768389%) IndexBlock(6,45.92 KB,0.0145603%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 03 21:18:01 compute-0 python3.9[146818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796680.789668-182-196313411724688/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:02 compute-0 ceph-mon[75204]: pgmap v343: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:02 compute-0 python3.9[146968]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:18:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v344: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:03 compute-0 sudo[147120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhtcusxyfsichlpmczcfovyuibtnpder ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796683.0351386-220-103752403460944/AnsiballZ_file.py'
Dec 03 21:18:03 compute-0 sudo[147120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:03 compute-0 python3.9[147122]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:03 compute-0 sudo[147120]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:04 compute-0 sudo[147272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaoisovxofkqphozrvabdnlnuwfcfcsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796683.8880002-228-52715379571926/AnsiballZ_stat.py'
Dec 03 21:18:04 compute-0 sudo[147272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:04 compute-0 python3.9[147274]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:04 compute-0 sudo[147272]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:04 compute-0 ceph-mon[75204]: pgmap v344: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:04 compute-0 sudo[147350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piuzonmhprukwytinsoarnofbgsisknu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796683.8880002-228-52715379571926/AnsiballZ_file.py'
Dec 03 21:18:04 compute-0 sudo[147350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:05 compute-0 python3.9[147352]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:05 compute-0 sudo[147350]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v345: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:05 compute-0 sudo[147502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zysvxcgkmxlvwxudxvxivfcuxzmuzjlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796685.2466493-228-26976360087708/AnsiballZ_stat.py'
Dec 03 21:18:05 compute-0 sudo[147502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:05 compute-0 python3.9[147504]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:05 compute-0 sudo[147502]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:06 compute-0 sudo[147580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuxzvbkgpiincskrbitszlsczbgrieio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796685.2466493-228-26976360087708/AnsiballZ_file.py'
Dec 03 21:18:06 compute-0 sudo[147580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:06 compute-0 python3.9[147582]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:06 compute-0 sudo[147580]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:06 compute-0 ceph-mon[75204]: pgmap v345: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:06 compute-0 sudo[147732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqnlkwcsyorppbyzywnbbacjrrlexoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796686.518685-251-173647089433998/AnsiballZ_file.py'
Dec 03 21:18:06 compute-0 sudo[147732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:07 compute-0 python3.9[147734]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:07 compute-0 sudo[147732]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v346: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:07 compute-0 sudo[147884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwaagjitdtbitvlhbzivughpvfwtbwxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796687.2342393-259-164718940363934/AnsiballZ_stat.py'
Dec 03 21:18:07 compute-0 sudo[147884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:07 compute-0 python3.9[147886]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:07 compute-0 sudo[147884]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:08 compute-0 sudo[147962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-optxcmktgcqpnsbyudfsdxtwjysbkvgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796687.2342393-259-164718940363934/AnsiballZ_file.py'
Dec 03 21:18:08 compute-0 sudo[147962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:08 compute-0 python3.9[147964]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:08 compute-0 sudo[147962]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:08 compute-0 ceph-mon[75204]: pgmap v346: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:08 compute-0 sudo[148114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjkkwiyomljxejrdkslxyzluxqhmorle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796688.4460144-271-171850381464037/AnsiballZ_stat.py'
Dec 03 21:18:08 compute-0 sudo[148114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:09 compute-0 python3.9[148116]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:09 compute-0 sudo[148114]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v347: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:09 compute-0 sudo[148192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkqkcdehwutysdyqbhjhdhqbmopvbykk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796688.4460144-271-171850381464037/AnsiballZ_file.py'
Dec 03 21:18:09 compute-0 sudo[148192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:09 compute-0 python3.9[148194]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:09 compute-0 sudo[148192]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:10 compute-0 sudo[148344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpmuvfcphcynbvzpijadfxtaetflrejb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796689.8705401-283-88447715545689/AnsiballZ_systemd.py'
Dec 03 21:18:10 compute-0 sudo[148344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:10 compute-0 python3.9[148346]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:18:10 compute-0 systemd[1]: Reloading.
Dec 03 21:18:10 compute-0 systemd-sysv-generator[148377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:18:10 compute-0 systemd-rc-local-generator[148374]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:18:10 compute-0 ceph-mon[75204]: pgmap v347: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:10 compute-0 sudo[148344]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v348: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:11 compute-0 sudo[148533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmcpggkjdbzrzeevixxrrzoxeovwgqvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796691.1835275-291-125863164305175/AnsiballZ_stat.py'
Dec 03 21:18:11 compute-0 sudo[148533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:11 compute-0 ceph-mon[75204]: pgmap v348: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:11 compute-0 python3.9[148535]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:11 compute-0 sudo[148533]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:12 compute-0 sudo[148611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzsnmndyzaqvqaiyzqdxqopnqltlkclo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796691.1835275-291-125863164305175/AnsiballZ_file.py'
Dec 03 21:18:12 compute-0 sudo[148611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:12 compute-0 python3.9[148613]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:12 compute-0 sudo[148611]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:12 compute-0 sudo[148763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzdudfzfmpqrfduqjaqajvzhumpqsrby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796692.601648-303-151730342933797/AnsiballZ_stat.py'
Dec 03 21:18:12 compute-0 sudo[148763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:13 compute-0 python3.9[148765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:13 compute-0 sudo[148763]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v349: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:13 compute-0 sudo[148841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-colpmvuwjdpmllwaokieokrwndkqpjfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796692.601648-303-151730342933797/AnsiballZ_file.py'
Dec 03 21:18:13 compute-0 sudo[148841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:13 compute-0 python3.9[148843]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:13 compute-0 sudo[148841]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:14 compute-0 ceph-mon[75204]: pgmap v349: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v350: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:15 compute-0 sudo[148993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpkapepaxnbamsdllhtwdehtarpesjss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796693.9928339-315-261807059454201/AnsiballZ_systemd.py'
Dec 03 21:18:15 compute-0 sudo[148993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:16 compute-0 python3.9[148995]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:18:16 compute-0 systemd[1]: Reloading.
Dec 03 21:18:16 compute-0 systemd-rc-local-generator[149026]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:18:16 compute-0 systemd-sysv-generator[149030]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:18:16 compute-0 ceph-mon[75204]: pgmap v350: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:16 compute-0 systemd[1]: Starting Create netns directory...
Dec 03 21:18:16 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 03 21:18:16 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 03 21:18:16 compute-0 systemd[1]: Finished Create netns directory.
Dec 03 21:18:16 compute-0 sudo[148993]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:17 compute-0 sudo[149187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocxflifybalukfnqtajtunwrhyolbpgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796696.923111-325-127793216908592/AnsiballZ_file.py'
Dec 03 21:18:17 compute-0 sudo[149187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v351: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:17 compute-0 python3.9[149189]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:17 compute-0 sudo[149187]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:18 compute-0 sudo[149339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgnbafkudoaqyobhplpclrbfkalezqal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796697.72217-333-100123149994289/AnsiballZ_stat.py'
Dec 03 21:18:18 compute-0 sudo[149339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:18 compute-0 python3.9[149341]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:18 compute-0 sudo[149339]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:18 compute-0 ceph-mon[75204]: pgmap v351: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:18 compute-0 sudo[149462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skldbcdxoeuxglmxplllltylfjguxzoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796697.72217-333-100123149994289/AnsiballZ_copy.py'
Dec 03 21:18:18 compute-0 sudo[149462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:18 compute-0 python3.9[149464]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764796697.72217-333-100123149994289/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:18 compute-0 sudo[149462]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v352: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:19 compute-0 sudo[149614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdfkdvejceogblstvmpdktpdbggsxcsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796699.4042664-350-32455996241869/AnsiballZ_file.py'
Dec 03 21:18:19 compute-0 sudo[149614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:19 compute-0 python3.9[149616]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:18:19 compute-0 sudo[149614]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:20 compute-0 ceph-mon[75204]: pgmap v352: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:20 compute-0 sudo[149766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljmqszfhcksodbfkqzgsynahokwdprnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796700.206883-358-230066662419595/AnsiballZ_stat.py'
Dec 03 21:18:20 compute-0 sudo[149766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:20 compute-0 python3.9[149768]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:18:20 compute-0 sudo[149766]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:18:21
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['volumes', 'cephfs.cephfs.data', 'cephfs.cephfs.meta', 'images', '.mgr', 'backups', 'vms']
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:18:21 compute-0 sudo[149889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcglfbvipihttaxibavrbopwxadelken ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796700.206883-358-230066662419595/AnsiballZ_copy.py'
Dec 03 21:18:21 compute-0 sudo[149889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v353: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:21 compute-0 python3.9[149891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764796700.206883-358-230066662419595/.source.json _original_basename=.4naw9mvp follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:21 compute-0 sudo[149889]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:18:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:18:22 compute-0 sudo[150041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jujwxfhoviglsifcgyllsrhvnrralxan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796701.718208-373-184492024803039/AnsiballZ_file.py'
Dec 03 21:18:22 compute-0 sudo[150041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:22 compute-0 python3.9[150043]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:22 compute-0 sudo[150041]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:22 compute-0 ceph-mon[75204]: pgmap v353: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:22 compute-0 sudo[150193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asihrrzshihrdsffqukggqktavwsfwgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796702.497068-381-143745738751352/AnsiballZ_stat.py'
Dec 03 21:18:22 compute-0 sudo[150193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:23 compute-0 sudo[150193]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v354: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:23 compute-0 sudo[150316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oibqfxdrkizkhgpexmahgovpqpvosymx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796702.497068-381-143745738751352/AnsiballZ_copy.py'
Dec 03 21:18:23 compute-0 sudo[150316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:23 compute-0 sudo[150316]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:24 compute-0 ceph-mon[75204]: pgmap v354: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:24 compute-0 sudo[150468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slbyxnokrxqskvzwvcupkrczwmxvmgti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796704.0567024-398-28851435908216/AnsiballZ_container_config_data.py'
Dec 03 21:18:24 compute-0 sudo[150468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:24 compute-0 python3.9[150470]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 03 21:18:24 compute-0 sudo[150468]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v355: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:25 compute-0 sudo[150620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgufivhzusxwainwthngmwkvvrhzubwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796705.1173725-407-34908127280146/AnsiballZ_container_config_hash.py'
Dec 03 21:18:25 compute-0 sudo[150620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:25 compute-0 python3.9[150622]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 03 21:18:25 compute-0 sudo[150620]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:26 compute-0 ceph-mon[75204]: pgmap v355: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:26 compute-0 sudo[150772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upuqlobjvxosugbcsaebrltdsjwxbeaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796706.14084-416-9689537313693/AnsiballZ_podman_container_info.py'
Dec 03 21:18:26 compute-0 sudo[150772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:26 compute-0 python3.9[150774]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 03 21:18:27 compute-0 sudo[150772]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v356: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:18:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:18:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:28 compute-0 sudo[150951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kknvxscazrpyelojwpkanbfquxfusnzi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764796707.7857246-429-59199555573610/AnsiballZ_edpm_container_manage.py'
Dec 03 21:18:28 compute-0 sudo[150951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:28 compute-0 ceph-mon[75204]: pgmap v356: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:28 compute-0 python3[150953]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 03 21:18:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v357: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:30 compute-0 ceph-mon[75204]: pgmap v357: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v358: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:32 compute-0 podman[151017]: 2025-12-03 21:18:32.228360818 +0000 UTC m=+1.169000155 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 21:18:32 compute-0 ceph-mon[75204]: pgmap v358: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v359: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:34 compute-0 ceph-mon[75204]: pgmap v359: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v360: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:35 compute-0 ceph-mon[75204]: pgmap v360: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v361: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v362: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:39 compute-0 ceph-mon[75204]: pgmap v361: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:39 compute-0 ceph-mon[75204]: pgmap v362: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:39 compute-0 podman[150967]: 2025-12-03 21:18:39.970323756 +0000 UTC m=+11.174990797 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 03 21:18:40 compute-0 podman[151112]: 2025-12-03 21:18:40.192712948 +0000 UTC m=+0.080856750 container create ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec 03 21:18:40 compute-0 podman[151112]: 2025-12-03 21:18:40.152724535 +0000 UTC m=+0.040868377 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 03 21:18:40 compute-0 python3[150953]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 03 21:18:40 compute-0 sudo[150951]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:40 compute-0 sudo[151302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anhpsnllkbknagaegzlezsteradtbrka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796720.607416-437-52181530646553/AnsiballZ_stat.py'
Dec 03 21:18:40 compute-0 sudo[151302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:41 compute-0 python3.9[151304]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:18:41 compute-0 sudo[151302]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v363: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:41 compute-0 ceph-mon[75204]: pgmap v363: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:41 compute-0 sudo[151456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijmkoqccfrfqagxwmgojxqqvewupiovb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796721.5079012-446-206475631290094/AnsiballZ_file.py'
Dec 03 21:18:41 compute-0 sudo[151456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:42 compute-0 sshd-session[151151]: Invalid user support from 91.202.233.33 port 21788
Dec 03 21:18:42 compute-0 python3.9[151458]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:42 compute-0 sudo[151456]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:42 compute-0 sudo[151532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hplcnxqeoruwzolyfkajgbqxnezegzzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796721.5079012-446-206475631290094/AnsiballZ_stat.py'
Dec 03 21:18:42 compute-0 sudo[151532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:42 compute-0 python3.9[151534]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:18:42 compute-0 sudo[151532]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:42 compute-0 sshd-session[151151]: Connection reset by invalid user support 91.202.233.33 port 21788 [preauth]
Dec 03 21:18:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v364: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:43 compute-0 sudo[151684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcoktqogtxbbbsynbfjhogildzakqiwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796722.7850075-446-184115650515974/AnsiballZ_copy.py'
Dec 03 21:18:43 compute-0 sudo[151684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:43 compute-0 python3.9[151686]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764796722.7850075-446-184115650515974/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:18:43 compute-0 sudo[151684]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:43 compute-0 sudo[151761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyxmctvigkufcjkbaefxnjagjszdsbqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796722.7850075-446-184115650515974/AnsiballZ_systemd.py'
Dec 03 21:18:43 compute-0 sudo[151761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:44 compute-0 python3.9[151763]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:18:44 compute-0 systemd[1]: Reloading.
Dec 03 21:18:44 compute-0 systemd-rc-local-generator[151787]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:18:44 compute-0 systemd-sysv-generator[151793]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:18:44 compute-0 ceph-mon[75204]: pgmap v364: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:44 compute-0 sudo[151761]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:44 compute-0 sshd-session[151611]: Invalid user admin from 91.202.233.33 port 21802
Dec 03 21:18:45 compute-0 sudo[151872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwcytbfxztqsznxfefnujgksxssslpjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796722.7850075-446-184115650515974/AnsiballZ_systemd.py'
Dec 03 21:18:45 compute-0 sudo[151872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v365: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:45 compute-0 python3.9[151874]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:18:45 compute-0 sshd-session[151611]: Connection reset by invalid user admin 91.202.233.33 port 21802 [preauth]
Dec 03 21:18:46 compute-0 systemd[1]: Reloading.
Dec 03 21:18:46 compute-0 ceph-mon[75204]: pgmap v365: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:46 compute-0 systemd-rc-local-generator[151904]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:18:46 compute-0 systemd-sysv-generator[151909]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:18:46 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Dec 03 21:18:46 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:18:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1af3e6be989aae80c7fa7a27b5aa50e0661b0544c6b5b89d7c1402b58f0905a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1af3e6be989aae80c7fa7a27b5aa50e0661b0544c6b5b89d7c1402b58f0905a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:46 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c.
Dec 03 21:18:46 compute-0 podman[151917]: 2025-12-03 21:18:46.996288428 +0000 UTC m=+0.180510544 container init ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + sudo -E kolla_set_configs
Dec 03 21:18:47 compute-0 podman[151917]: 2025-12-03 21:18:47.032562637 +0000 UTC m=+0.216784753 container start ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 03 21:18:47 compute-0 edpm-start-podman-container[151917]: ovn_metadata_agent
Dec 03 21:18:47 compute-0 podman[151938]: 2025-12-03 21:18:47.121160932 +0000 UTC m=+0.070492016 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 21:18:47 compute-0 edpm-start-podman-container[151916]: Creating additional drop-in dependency for "ovn_metadata_agent" (ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c)
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Validating config file
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Copying service configuration files
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Writing out command to execute
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: ++ cat /run_command
Dec 03 21:18:47 compute-0 systemd[1]: Reloading.
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + CMD=neutron-ovn-metadata-agent
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + ARGS=
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + sudo kolla_copy_cacerts
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + [[ ! -n '' ]]
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + . kolla_extend_start
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: Running command: 'neutron-ovn-metadata-agent'
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + umask 0022
Dec 03 21:18:47 compute-0 ovn_metadata_agent[151932]: + exec neutron-ovn-metadata-agent
Dec 03 21:18:47 compute-0 systemd-rc-local-generator[152006]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:18:47 compute-0 systemd-sysv-generator[152010]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:18:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v366: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:47 compute-0 systemd[1]: Started ovn_metadata_agent container.
Dec 03 21:18:47 compute-0 sudo[151872]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:47 compute-0 sshd-session[151876]: Connection reset by authenticating user root 91.202.233.33 port 21816 [preauth]
Dec 03 21:18:47 compute-0 sshd-session[142939]: Connection closed by 192.168.122.30 port 50544
Dec 03 21:18:47 compute-0 sshd-session[142936]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:18:47 compute-0 systemd[1]: session-47.scope: Deactivated successfully.
Dec 03 21:18:47 compute-0 systemd[1]: session-47.scope: Consumed 1min 1.553s CPU time.
Dec 03 21:18:47 compute-0 systemd-logind[787]: Session 47 logged out. Waiting for processes to exit.
Dec 03 21:18:47 compute-0 systemd-logind[787]: Removed session 47.
Dec 03 21:18:48 compute-0 ceph-mon[75204]: pgmap v366: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.885 151937 INFO neutron.common.config [-] Logging enabled!
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.885 151937 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.885 151937 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.886 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.886 151937 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.886 151937 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.886 151937 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.886 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.887 151937 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.888 151937 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.889 151937 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.890 151937 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.891 151937 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.892 151937 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.893 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.894 151937 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.895 151937 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.896 151937 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.897 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.898 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.898 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.898 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.898 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.898 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.898 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.898 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.899 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.900 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.901 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.902 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.903 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.904 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.905 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.906 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.907 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.908 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.909 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.910 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.911 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.912 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.913 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.914 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.914 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.914 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.914 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.914 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.914 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.915 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.915 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.915 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.915 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.915 151937 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.915 151937 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.916 151937 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.916 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.916 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.916 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.916 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.916 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.917 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.917 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.917 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.917 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.917 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.917 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.918 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.919 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.919 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.919 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.919 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.919 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.919 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.919 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.920 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.921 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.921 151937 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.921 151937 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.930 151937 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.930 151937 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.930 151937 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.930 151937 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.930 151937 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.944 151937 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name f27c01e7-5b62-4209-a664-3ae50b74644d (UUID: f27c01e7-5b62-4209-a664-3ae50b74644d) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.965 151937 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.966 151937 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.966 151937 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.966 151937 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.968 151937 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.974 151937 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.978 151937 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'f27c01e7-5b62-4209-a664-3ae50b74644d'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fcf8e279af0>], external_ids={}, name=f27c01e7-5b62-4209-a664-3ae50b74644d, nb_cfg_timestamp=1764796659125, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.978 151937 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fcf8e27cb20>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.979 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.979 151937 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.979 151937 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.979 151937 INFO oslo_service.service [-] Starting 1 workers
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.983 151937 DEBUG oslo_service.service [-] Started child 152048 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.987 151937 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpy_tfhlj0/privsep.sock']
Dec 03 21:18:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:48.989 152048 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-429722'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.027 152048 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.028 152048 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.028 152048 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.033 152048 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.042 152048 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.051 152048 INFO eventlet.wsgi.server [-] (152048) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 03 21:18:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v367: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:49 compute-0 sshd-session[152046]: Invalid user user from 91.202.233.33 port 21830
Dec 03 21:18:49 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 03 21:18:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.667 151937 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.668 151937 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpy_tfhlj0/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.550 152053 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.553 152053 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.555 152053 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.555 152053 INFO oslo.privsep.daemon [-] privsep daemon running as pid 152053
Dec 03 21:18:49 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:49.670 152053 DEBUG oslo.privsep.daemon [-] privsep: reply[88d87757-c1da-4f20-b8da-8be9d4811535]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 03 21:18:49 compute-0 sshd-session[152046]: Connection reset by invalid user user 91.202.233.33 port 21830 [preauth]
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.136 152053 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.137 152053 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.137 152053 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:18:50 compute-0 ceph-mon[75204]: pgmap v367: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.637 152053 DEBUG oslo.privsep.daemon [-] privsep: reply[e3bb22a8-63ee-4972-8b04-f7010411a591]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.641 151937 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=f27c01e7-5b62-4209-a664-3ae50b74644d, column=external_ids, values=({'neutron:ovn-metadata-id': '50a49342-76af-5160-b4ff-e6b2680e1d47'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.656 151937 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f27c01e7-5b62-4209-a664-3ae50b74644d, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.665 151937 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.665 151937 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.665 151937 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.665 151937 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.666 151937 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.666 151937 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.666 151937 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.666 151937 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.667 151937 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.667 151937 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.667 151937 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.667 151937 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.668 151937 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.668 151937 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.668 151937 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.668 151937 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.669 151937 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.669 151937 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.669 151937 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.669 151937 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.670 151937 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.670 151937 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.670 151937 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.670 151937 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.671 151937 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.671 151937 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.671 151937 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.672 151937 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.672 151937 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.672 151937 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.672 151937 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.672 151937 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.673 151937 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.673 151937 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.673 151937 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.674 151937 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.674 151937 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.674 151937 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.674 151937 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.675 151937 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.675 151937 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.675 151937 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.675 151937 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.676 151937 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.676 151937 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.676 151937 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.676 151937 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.676 151937 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.677 151937 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.677 151937 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.677 151937 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.677 151937 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.678 151937 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.678 151937 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.678 151937 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.678 151937 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.678 151937 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.679 151937 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.679 151937 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.679 151937 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.679 151937 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.679 151937 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.680 151937 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.680 151937 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.680 151937 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.680 151937 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.681 151937 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.681 151937 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.681 151937 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.681 151937 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.681 151937 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.682 151937 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.682 151937 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.682 151937 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.682 151937 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.683 151937 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.683 151937 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.683 151937 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.683 151937 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.684 151937 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.684 151937 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.684 151937 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.684 151937 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.684 151937 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.685 151937 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.685 151937 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.685 151937 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.685 151937 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.686 151937 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.686 151937 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.686 151937 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.686 151937 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.686 151937 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.687 151937 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.687 151937 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.687 151937 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.687 151937 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.688 151937 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.688 151937 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.688 151937 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.688 151937 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.688 151937 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.689 151937 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.689 151937 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.689 151937 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.689 151937 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.689 151937 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.690 151937 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.690 151937 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.690 151937 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.691 151937 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.691 151937 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.691 151937 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.691 151937 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.691 151937 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.692 151937 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.692 151937 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.692 151937 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.692 151937 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.693 151937 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.693 151937 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.693 151937 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.693 151937 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.694 151937 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.694 151937 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.694 151937 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.694 151937 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.695 151937 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.695 151937 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.695 151937 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.695 151937 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.696 151937 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.696 151937 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.696 151937 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.696 151937 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.697 151937 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.697 151937 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.697 151937 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.697 151937 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.697 151937 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.698 151937 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.698 151937 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.698 151937 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.698 151937 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.699 151937 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.699 151937 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.699 151937 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.699 151937 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.699 151937 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.700 151937 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.700 151937 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.700 151937 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.700 151937 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.701 151937 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.701 151937 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.701 151937 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.701 151937 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.701 151937 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.702 151937 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.702 151937 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.702 151937 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.702 151937 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.702 151937 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.703 151937 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.703 151937 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.703 151937 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.703 151937 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.703 151937 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.704 151937 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.704 151937 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.704 151937 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.704 151937 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.705 151937 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.705 151937 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.705 151937 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.705 151937 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.705 151937 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.706 151937 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.706 151937 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.706 151937 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.706 151937 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.707 151937 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.707 151937 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.707 151937 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.707 151937 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.707 151937 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.708 151937 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.708 151937 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.708 151937 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.708 151937 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.708 151937 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.708 151937 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.708 151937 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.709 151937 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.710 151937 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.710 151937 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.710 151937 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.710 151937 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.710 151937 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.710 151937 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.710 151937 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.711 151937 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.712 151937 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.712 151937 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.712 151937 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.712 151937 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.712 151937 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.712 151937 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.712 151937 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.713 151937 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.714 151937 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.714 151937 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.714 151937 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.714 151937 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.714 151937 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.714 151937 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.714 151937 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.715 151937 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.715 151937 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.715 151937 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.715 151937 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.715 151937 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.715 151937 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.715 151937 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.716 151937 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.716 151937 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.716 151937 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.716 151937 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.716 151937 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.716 151937 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.716 151937 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.717 151937 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.717 151937 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.717 151937 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.717 151937 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.717 151937 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.717 151937 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.717 151937 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.718 151937 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.719 151937 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.719 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.719 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.719 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.719 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.719 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.719 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.720 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.720 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.720 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.720 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.720 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.720 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.720 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.721 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.721 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.721 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.721 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.721 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.721 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.721 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.722 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.722 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.722 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.722 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.722 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.722 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.722 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.723 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.723 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.723 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.723 151937 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.723 151937 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.723 151937 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.723 151937 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.724 151937 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:18:50 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:18:50.724 151937 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 03 21:18:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v368: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:18:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:18:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:18:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:18:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:18:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:18:52 compute-0 sshd-session[152058]: Connection reset by authenticating user root 91.202.233.33 port 35376 [preauth]
Dec 03 21:18:52 compute-0 ceph-mon[75204]: pgmap v368: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v369: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:53 compute-0 sudo[152060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:18:53 compute-0 sudo[152060]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:53 compute-0 sudo[152060]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:53 compute-0 sudo[152085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 check-host
Dec 03 21:18:53 compute-0 sudo[152085]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:53 compute-0 sshd-session[152112]: Accepted publickey for zuul from 192.168.122.30 port 45744 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:18:53 compute-0 systemd-logind[787]: New session 48 of user zuul.
Dec 03 21:18:53 compute-0 systemd[1]: Started Session 48 of User zuul.
Dec 03 21:18:53 compute-0 sshd-session[152112]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:18:53 compute-0 sudo[152085]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:18:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:18:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:18:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:18:53 compute-0 sudo[152134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:18:53 compute-0 sudo[152134]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:53 compute-0 sudo[152134]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:53 compute-0 sudo[152187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:18:53 compute-0 sudo[152187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:54 compute-0 ceph-mon[75204]: pgmap v369: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:18:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:18:54 compute-0 sudo[152187]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:18:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:18:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:18:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:18:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:18:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:18:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:18:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:18:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:18:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:18:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:18:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:18:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:18:54 compute-0 sudo[152365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:18:54 compute-0 sudo[152365]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:54 compute-0 sudo[152365]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:54 compute-0 sudo[152390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:18:54 compute-0 sudo[152390]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:54 compute-0 python3.9[152364]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:18:55 compute-0 podman[152431]: 2025-12-03 21:18:55.086491382 +0000 UTC m=+0.038940159 container create 81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_mclaren, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:18:55 compute-0 systemd[1]: Started libpod-conmon-81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc.scope.
Dec 03 21:18:55 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:18:55 compute-0 podman[152431]: 2025-12-03 21:18:55.068200549 +0000 UTC m=+0.020649366 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:18:55 compute-0 podman[152431]: 2025-12-03 21:18:55.182394666 +0000 UTC m=+0.134843543 container init 81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_mclaren, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:18:55 compute-0 podman[152431]: 2025-12-03 21:18:55.190374852 +0000 UTC m=+0.142823659 container start 81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_mclaren, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:18:55 compute-0 podman[152431]: 2025-12-03 21:18:55.195410543 +0000 UTC m=+0.147859420 container attach 81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_mclaren, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:18:55 compute-0 magical_mclaren[152461]: 167 167
Dec 03 21:18:55 compute-0 systemd[1]: libpod-81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc.scope: Deactivated successfully.
Dec 03 21:18:55 compute-0 conmon[152461]: conmon 81eccddf2beb9b976cd1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc.scope/container/memory.events
Dec 03 21:18:55 compute-0 podman[152431]: 2025-12-03 21:18:55.201129301 +0000 UTC m=+0.153578098 container died 81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_mclaren, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:18:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-aa0f619a1051df77f218b259db3f5c0db0c3a8c998233f29d06187bc5b0682eb-merged.mount: Deactivated successfully.
Dec 03 21:18:55 compute-0 podman[152431]: 2025-12-03 21:18:55.273183097 +0000 UTC m=+0.225631904 container remove 81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_mclaren, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:18:55 compute-0 systemd[1]: libpod-conmon-81eccddf2beb9b976cd1390798ba0a02b976e63bda0fe2e9919e404d74cc33bc.scope: Deactivated successfully.
Dec 03 21:18:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v370: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:18:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:18:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:18:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:18:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:18:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:18:55 compute-0 podman[152527]: 2025-12-03 21:18:55.495364328 +0000 UTC m=+0.057967261 container create 8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_cartwright, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default)
Dec 03 21:18:55 compute-0 systemd[1]: Started libpod-conmon-8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c.scope.
Dec 03 21:18:55 compute-0 podman[152527]: 2025-12-03 21:18:55.467485956 +0000 UTC m=+0.030088949 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:18:55 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:18:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a877390519504be18c491692cfcf9f91a9674efbcc500ccc96a9e96b9c119395/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a877390519504be18c491692cfcf9f91a9674efbcc500ccc96a9e96b9c119395/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a877390519504be18c491692cfcf9f91a9674efbcc500ccc96a9e96b9c119395/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a877390519504be18c491692cfcf9f91a9674efbcc500ccc96a9e96b9c119395/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a877390519504be18c491692cfcf9f91a9674efbcc500ccc96a9e96b9c119395/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:55 compute-0 podman[152527]: 2025-12-03 21:18:55.589726941 +0000 UTC m=+0.152329894 container init 8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_cartwright, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:18:55 compute-0 podman[152527]: 2025-12-03 21:18:55.599826373 +0000 UTC m=+0.162429286 container start 8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_cartwright, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:18:55 compute-0 podman[152527]: 2025-12-03 21:18:55.603405186 +0000 UTC m=+0.166008099 container attach 8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_cartwright, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:18:55 compute-0 sudo[152644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgaeawjztsbrzgrawaqzcijpeqxknnsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796735.3735747-34-38715035119184/AnsiballZ_command.py'
Dec 03 21:18:55 compute-0 sudo[152644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:56 compute-0 objective_cartwright[152564]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:18:56 compute-0 objective_cartwright[152564]: --> All data devices are unavailable
Dec 03 21:18:56 compute-0 python3.9[152648]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:18:56 compute-0 systemd[1]: libpod-8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c.scope: Deactivated successfully.
Dec 03 21:18:56 compute-0 podman[152527]: 2025-12-03 21:18:56.137896154 +0000 UTC m=+0.700499067 container died 8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_cartwright, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:18:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-a877390519504be18c491692cfcf9f91a9674efbcc500ccc96a9e96b9c119395-merged.mount: Deactivated successfully.
Dec 03 21:18:56 compute-0 podman[152527]: 2025-12-03 21:18:56.173636448 +0000 UTC m=+0.736239361 container remove 8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_cartwright, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:18:56 compute-0 systemd[1]: libpod-conmon-8144338f30fe8fc144bbe15c2f24c0fc33f13e7631104cdcc9b73d81089e450c.scope: Deactivated successfully.
Dec 03 21:18:56 compute-0 sudo[152644]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:56 compute-0 sudo[152390]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:56 compute-0 sudo[152685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:18:56 compute-0 sudo[152685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:56 compute-0 sudo[152685]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:56 compute-0 sudo[152734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:18:56 compute-0 sudo[152734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:56 compute-0 ceph-mon[75204]: pgmap v370: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:56 compute-0 podman[152792]: 2025-12-03 21:18:56.720039636 +0000 UTC m=+0.048917108 container create edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_yonath, org.label-schema.license=GPLv2, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:18:56 compute-0 systemd[1]: Started libpod-conmon-edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46.scope.
Dec 03 21:18:56 compute-0 podman[152792]: 2025-12-03 21:18:56.699199526 +0000 UTC m=+0.028077038 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:18:56 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:18:56 compute-0 podman[152792]: 2025-12-03 21:18:56.816201465 +0000 UTC m=+0.145078937 container init edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_yonath, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:18:56 compute-0 podman[152792]: 2025-12-03 21:18:56.824010567 +0000 UTC m=+0.152888039 container start edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_yonath, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True)
Dec 03 21:18:56 compute-0 podman[152792]: 2025-12-03 21:18:56.827422305 +0000 UTC m=+0.156299777 container attach edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_yonath, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030)
Dec 03 21:18:56 compute-0 relaxed_yonath[152828]: 167 167
Dec 03 21:18:56 compute-0 podman[152792]: 2025-12-03 21:18:56.830079524 +0000 UTC m=+0.158956996 container died edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_yonath, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:18:56 compute-0 systemd[1]: libpod-edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46.scope: Deactivated successfully.
Dec 03 21:18:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-db412f71611247c22ef5cb00af7a723e21d90c73b551a069249abc0cfeafc806-merged.mount: Deactivated successfully.
Dec 03 21:18:56 compute-0 podman[152792]: 2025-12-03 21:18:56.871643551 +0000 UTC m=+0.200521023 container remove edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_yonath, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:18:56 compute-0 systemd[1]: libpod-conmon-edaaeea4f3309de9236a7ef77a7beae511e180bf0c4c4743f150f5b98edd1f46.scope: Deactivated successfully.
Dec 03 21:18:57 compute-0 podman[152863]: 2025-12-03 21:18:57.087849348 +0000 UTC m=+0.059451740 container create b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_almeida, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:18:57 compute-0 systemd[1]: Started libpod-conmon-b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce.scope.
Dec 03 21:18:57 compute-0 podman[152863]: 2025-12-03 21:18:57.06669901 +0000 UTC m=+0.038301412 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:18:57 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:18:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df5f8525e5015a1dd0dd5bff8012fd7b1ff10277f99fa277610ad80f947c5c1e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df5f8525e5015a1dd0dd5bff8012fd7b1ff10277f99fa277610ad80f947c5c1e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df5f8525e5015a1dd0dd5bff8012fd7b1ff10277f99fa277610ad80f947c5c1e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df5f8525e5015a1dd0dd5bff8012fd7b1ff10277f99fa277610ad80f947c5c1e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:57 compute-0 podman[152863]: 2025-12-03 21:18:57.188252927 +0000 UTC m=+0.159855379 container init b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_almeida, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:18:57 compute-0 podman[152863]: 2025-12-03 21:18:57.201537621 +0000 UTC m=+0.173139983 container start b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_almeida, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:18:57 compute-0 podman[152863]: 2025-12-03 21:18:57.205185806 +0000 UTC m=+0.176788248 container attach b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_almeida, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:18:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v371: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:57 compute-0 sudo[152957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygrtmintliezjyqbbmybvakmucmjqaki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796736.6260376-45-131414765637812/AnsiballZ_systemd_service.py'
Dec 03 21:18:57 compute-0 sudo[152957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:18:57 compute-0 interesting_almeida[152879]: {
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:     "0": [
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:         {
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "devices": [
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "/dev/loop3"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             ],
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_name": "ceph_lv0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_size": "21470642176",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "name": "ceph_lv0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "tags": {
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cluster_name": "ceph",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.crush_device_class": "",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.encrypted": "0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.objectstore": "bluestore",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osd_id": "0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.type": "block",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.vdo": "0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.with_tpm": "0"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             },
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "type": "block",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "vg_name": "ceph_vg0"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:         }
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:     ],
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:     "1": [
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:         {
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "devices": [
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "/dev/loop4"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             ],
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_name": "ceph_lv1",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_size": "21470642176",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "name": "ceph_lv1",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "tags": {
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cluster_name": "ceph",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.crush_device_class": "",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.encrypted": "0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.objectstore": "bluestore",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osd_id": "1",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.type": "block",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.vdo": "0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.with_tpm": "0"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             },
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "type": "block",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "vg_name": "ceph_vg1"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:         }
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:     ],
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:     "2": [
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:         {
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "devices": [
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "/dev/loop5"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             ],
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_name": "ceph_lv2",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_size": "21470642176",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "name": "ceph_lv2",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "tags": {
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.cluster_name": "ceph",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.crush_device_class": "",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.encrypted": "0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.objectstore": "bluestore",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osd_id": "2",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.type": "block",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.vdo": "0",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:                 "ceph.with_tpm": "0"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             },
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "type": "block",
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:             "vg_name": "ceph_vg2"
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:         }
Dec 03 21:18:57 compute-0 interesting_almeida[152879]:     ]
Dec 03 21:18:57 compute-0 interesting_almeida[152879]: }
Dec 03 21:18:57 compute-0 systemd[1]: libpod-b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce.scope: Deactivated successfully.
Dec 03 21:18:57 compute-0 podman[152863]: 2025-12-03 21:18:57.620078387 +0000 UTC m=+0.591680749 container died b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_almeida, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:18:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-df5f8525e5015a1dd0dd5bff8012fd7b1ff10277f99fa277610ad80f947c5c1e-merged.mount: Deactivated successfully.
Dec 03 21:18:57 compute-0 podman[152863]: 2025-12-03 21:18:57.682162374 +0000 UTC m=+0.653764736 container remove b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_almeida, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:18:57 compute-0 systemd[1]: libpod-conmon-b0b2866adbd1213bfeac11d3817b4f29296e2a90ab8dfbf64aa3e135e909f5ce.scope: Deactivated successfully.
Dec 03 21:18:57 compute-0 sudo[152734]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:57 compute-0 python3.9[152961]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:18:57 compute-0 systemd[1]: Reloading.
Dec 03 21:18:57 compute-0 systemd-rc-local-generator[153026]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:18:57 compute-0 systemd-sysv-generator[153030]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:18:58 compute-0 sudo[152975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:18:58 compute-0 sudo[152975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:58 compute-0 sudo[152975]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:58 compute-0 sudo[152957]: pam_unix(sudo:session): session closed for user root
Dec 03 21:18:58 compute-0 sudo[153035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:18:58 compute-0 sudo[153035]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:18:58 compute-0 podman[153148]: 2025-12-03 21:18:58.427204113 +0000 UTC m=+0.067557549 container create 50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_dijkstra, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:18:58 compute-0 systemd[1]: Started libpod-conmon-50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29.scope.
Dec 03 21:18:58 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:18:58 compute-0 podman[153148]: 2025-12-03 21:18:58.402416582 +0000 UTC m=+0.042770068 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:18:58 compute-0 podman[153148]: 2025-12-03 21:18:58.505907071 +0000 UTC m=+0.146260557 container init 50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_dijkstra, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:18:58 compute-0 podman[153148]: 2025-12-03 21:18:58.513940049 +0000 UTC m=+0.154293525 container start 50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_dijkstra, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:18:58 compute-0 podman[153148]: 2025-12-03 21:18:58.518084046 +0000 UTC m=+0.158437522 container attach 50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_dijkstra, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:18:58 compute-0 busy_dijkstra[153164]: 167 167
Dec 03 21:18:58 compute-0 systemd[1]: libpod-50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29.scope: Deactivated successfully.
Dec 03 21:18:58 compute-0 podman[153148]: 2025-12-03 21:18:58.522301176 +0000 UTC m=+0.162654642 container died 50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_dijkstra, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:18:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-59949f01963d83f19911d0e8b7c0899340aedaa2a77b0e44414801535a1f0899-merged.mount: Deactivated successfully.
Dec 03 21:18:58 compute-0 podman[153148]: 2025-12-03 21:18:58.573131111 +0000 UTC m=+0.213484587 container remove 50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_dijkstra, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:18:58 compute-0 systemd[1]: libpod-conmon-50d50d817d9bf90f2c101cc83a27e253e77f8e6e27478612bde8546953cb9a29.scope: Deactivated successfully.
Dec 03 21:18:58 compute-0 ceph-mon[75204]: pgmap v371: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:58 compute-0 podman[153261]: 2025-12-03 21:18:58.820107545 +0000 UTC m=+0.074243032 container create 6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_lumiere, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:18:58 compute-0 systemd[1]: Started libpod-conmon-6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666.scope.
Dec 03 21:18:58 compute-0 podman[153261]: 2025-12-03 21:18:58.790651153 +0000 UTC m=+0.044786690 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:18:58 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:18:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d1882832d911e92556a8534d548c3baa2b31f9dd3150afe8e8f01eeeec73072/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d1882832d911e92556a8534d548c3baa2b31f9dd3150afe8e8f01eeeec73072/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d1882832d911e92556a8534d548c3baa2b31f9dd3150afe8e8f01eeeec73072/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d1882832d911e92556a8534d548c3baa2b31f9dd3150afe8e8f01eeeec73072/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:18:58 compute-0 python3.9[153255]: ansible-ansible.builtin.service_facts Invoked
Dec 03 21:18:58 compute-0 podman[153261]: 2025-12-03 21:18:58.919328265 +0000 UTC m=+0.173463812 container init 6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_lumiere, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:18:58 compute-0 podman[153261]: 2025-12-03 21:18:58.929761374 +0000 UTC m=+0.183896861 container start 6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_lumiere, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:18:58 compute-0 network[153299]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:18:58 compute-0 network[153300]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:18:58 compute-0 network[153301]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:18:59 compute-0 podman[153261]: 2025-12-03 21:18:59.169136312 +0000 UTC m=+0.423271869 container attach 6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_lumiere, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:18:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v372: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:18:59 compute-0 lvm[153385]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:18:59 compute-0 lvm[153385]: VG ceph_vg2 finished
Dec 03 21:18:59 compute-0 lvm[153384]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:18:59 compute-0 lvm[153384]: VG ceph_vg1 finished
Dec 03 21:18:59 compute-0 lvm[153381]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:18:59 compute-0 lvm[153381]: VG ceph_vg0 finished
Dec 03 21:19:00 compute-0 romantic_lumiere[153278]: {}
Dec 03 21:19:00 compute-0 podman[153261]: 2025-12-03 21:19:00.091761908 +0000 UTC m=+1.345897465 container died 6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_lumiere, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:19:00 compute-0 systemd[1]: libpod-6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666.scope: Deactivated successfully.
Dec 03 21:19:00 compute-0 systemd[1]: libpod-6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666.scope: Consumed 1.354s CPU time.
Dec 03 21:19:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:00 compute-0 ceph-mon[75204]: pgmap v372: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-7d1882832d911e92556a8534d548c3baa2b31f9dd3150afe8e8f01eeeec73072-merged.mount: Deactivated successfully.
Dec 03 21:19:00 compute-0 podman[153261]: 2025-12-03 21:19:00.401881238 +0000 UTC m=+1.656016705 container remove 6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_lumiere, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:19:00 compute-0 systemd[1]: libpod-conmon-6d746c0d9901f36515318597abd149681402966f5c7a93b5d51f63757e229666.scope: Deactivated successfully.
Dec 03 21:19:00 compute-0 sudo[153035]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:19:00 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:19:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:19:00 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:19:00 compute-0 sudo[153432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:19:00 compute-0 sudo[153432]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:19:00 compute-0 sudo[153432]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v373: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:19:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:19:02 compute-0 ceph-mon[75204]: pgmap v373: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:03 compute-0 sudo[153679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgntzcmyvaapdzqkuhixbozlsxpcnizu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796742.7831573-64-262996501035784/AnsiballZ_systemd_service.py'
Dec 03 21:19:03 compute-0 sudo[153679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v374: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:03 compute-0 python3.9[153681]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:19:03 compute-0 sudo[153679]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:03 compute-0 sudo[153832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojvvulkafrgrdkxtgodbneuaezzccggs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796743.6308258-64-24223187877279/AnsiballZ_systemd_service.py'
Dec 03 21:19:03 compute-0 sudo[153832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:04 compute-0 python3.9[153834]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:19:04 compute-0 sudo[153832]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:04 compute-0 ceph-mon[75204]: pgmap v374: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:04 compute-0 sudo[153985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljicalwdkabrbahtjvwrjunifbwrtnvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796744.438669-64-217888747375253/AnsiballZ_systemd_service.py'
Dec 03 21:19:04 compute-0 sudo[153985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:05 compute-0 python3.9[153987]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:19:05 compute-0 sudo[153985]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v375: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:05 compute-0 sudo[154138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaljdyphpainxgzqdbshnasjrwuyuqsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796745.3712301-64-182763116932244/AnsiballZ_systemd_service.py'
Dec 03 21:19:05 compute-0 sudo[154138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:05 compute-0 ceph-mon[75204]: pgmap v375: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:06 compute-0 python3.9[154140]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:19:06 compute-0 sudo[154138]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:06 compute-0 podman[154141]: 2025-12-03 21:19:06.190653797 +0000 UTC m=+0.121008573 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Dec 03 21:19:06 compute-0 sudo[154318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsfrvkyyjqvxmrgxicmmwxuspskytzwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796746.3075776-64-262980349692225/AnsiballZ_systemd_service.py'
Dec 03 21:19:06 compute-0 sudo[154318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:07 compute-0 python3.9[154320]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:19:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v376: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:08 compute-0 sudo[154318]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:08 compute-0 ceph-mon[75204]: pgmap v376: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:08 compute-0 sudo[154471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyironxqepwymjmlfunikrconjqeorgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796748.3060195-64-40238379125127/AnsiballZ_systemd_service.py'
Dec 03 21:19:08 compute-0 sudo[154471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:09 compute-0 python3.9[154473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:19:09 compute-0 sudo[154471]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v377: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:09 compute-0 sudo[154624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqfurskuaokvioltodropjedkikyegfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796749.2458131-64-262868921314878/AnsiballZ_systemd_service.py'
Dec 03 21:19:09 compute-0 sudo[154624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:09 compute-0 python3.9[154626]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:19:09 compute-0 sudo[154624]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:19:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 16.41 MB, 0.03 MB/s
                                           Interval WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:19:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:10 compute-0 ceph-mon[75204]: pgmap v377: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:10 compute-0 sudo[154777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyrlvdydrzcjdtacnhevqshmhxkbvdhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796750.277784-116-181208815091096/AnsiballZ_file.py'
Dec 03 21:19:10 compute-0 sudo[154777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:11 compute-0 python3.9[154779]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:11 compute-0 sudo[154777]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v378: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:11 compute-0 sudo[154929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itpbgqmiworpyrztqvuhklwgkfcgznaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796751.3030221-116-140631773616162/AnsiballZ_file.py'
Dec 03 21:19:11 compute-0 sudo[154929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:11 compute-0 python3.9[154931]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:11 compute-0 sudo[154929]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:12 compute-0 ceph-mon[75204]: pgmap v378: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:12 compute-0 sudo[155081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fumgfemsrcaudgtferxgafxmvoibgsdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796752.1602402-116-87916589290074/AnsiballZ_file.py'
Dec 03 21:19:12 compute-0 sudo[155081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:12 compute-0 python3.9[155083]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:12 compute-0 sudo[155081]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:13 compute-0 sudo[155233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbufeyiisvtedvqeucpdsdgrsdcghnxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796752.9062579-116-222550788281453/AnsiballZ_file.py'
Dec 03 21:19:13 compute-0 sudo[155233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v379: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:13 compute-0 python3.9[155235]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:13 compute-0 sudo[155233]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:13 compute-0 sudo[155385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnpeyjyclbdokymxjvkvjyvxicooatch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796753.6514924-116-93750735912555/AnsiballZ_file.py'
Dec 03 21:19:13 compute-0 sudo[155385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:14 compute-0 python3.9[155387]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:14 compute-0 sudo[155385]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:19:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 16.57 MB, 0.03 MB/s
                                           Interval WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:19:14 compute-0 ceph-mon[75204]: pgmap v379: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:14 compute-0 sudo[155537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujeefnzrgfyfqopaozzpqujgyevfnkiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796754.375892-116-151281101540481/AnsiballZ_file.py'
Dec 03 21:19:14 compute-0 sudo[155537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:14 compute-0 python3.9[155539]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:14 compute-0 sudo[155537]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v380: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:15 compute-0 sudo[155689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbbtcndwivrimqnfwopnsyixlgvhgoeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796755.045991-116-34420558740208/AnsiballZ_file.py'
Dec 03 21:19:15 compute-0 sudo[155689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:15 compute-0 python3.9[155691]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:15 compute-0 sudo[155689]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:16 compute-0 sudo[155841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvzelilxwarsbnkhtavxnvgowdbezqgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796755.8327515-166-215938831026489/AnsiballZ_file.py'
Dec 03 21:19:16 compute-0 sudo[155841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:16 compute-0 python3.9[155843]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:16 compute-0 sudo[155841]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:16 compute-0 ceph-mon[75204]: pgmap v380: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:17 compute-0 sudo[155993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxhfeeyhwkkfcsgfxuatxhaxpsreewna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796756.6266909-166-44750921824124/AnsiballZ_file.py'
Dec 03 21:19:17 compute-0 sudo[155993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v381: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:17 compute-0 python3.9[155995]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:17 compute-0 sudo[155993]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:18 compute-0 sudo[156165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjaksoybmwauncogsjexbfusqvwugygs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796757.6590877-166-180898971262983/AnsiballZ_file.py'
Dec 03 21:19:18 compute-0 sudo[156165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:18 compute-0 podman[156119]: 2025-12-03 21:19:18.030645336 +0000 UTC m=+0.089095018 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:19:18 compute-0 python3.9[156167]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:18 compute-0 sudo[156165]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:18 compute-0 ceph-mon[75204]: pgmap v381: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:18 compute-0 sudo[156318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhcadnsxoxaaynttggcstfgjxydohvqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796758.3747344-166-199103920700896/AnsiballZ_file.py'
Dec 03 21:19:18 compute-0 sudo[156318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:18 compute-0 python3.9[156320]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:18 compute-0 sudo[156318]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:19:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 16.19 MB, 0.03 MB/s
                                           Interval WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:19:19 compute-0 sudo[156470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxdhknunuegqyqyfpsqgmphyxmhabulh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796759.0475872-166-167543962420383/AnsiballZ_file.py'
Dec 03 21:19:19 compute-0 sudo[156470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v382: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:19 compute-0 python3.9[156472]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:19 compute-0 sudo[156470]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:20 compute-0 sudo[156622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-megbqhfwcptwbnqtqwhdimxhfcndqpuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796759.7417648-166-17228222702107/AnsiballZ_file.py'
Dec 03 21:19:20 compute-0 sudo[156622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:20 compute-0 python3.9[156624]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:20 compute-0 sudo[156622]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:20 compute-0 ceph-mon[75204]: pgmap v382: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:20 compute-0 sudo[156774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnjlznzalwuimjwhugylictpvfmfrfyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796760.566103-166-78599881171419/AnsiballZ_file.py'
Dec 03 21:19:20 compute-0 sudo[156774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:21 compute-0 python3.9[156776]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:19:21
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['volumes', 'backups', '.mgr', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'images', 'vms']
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:19:21 compute-0 sudo[156774]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [devicehealth INFO root] Check health
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v383: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:19:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:19:21 compute-0 sudo[156926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqtrxytzzajeyrnnlsejfnqagpkqrrds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796761.5483716-217-274085215040103/AnsiballZ_command.py'
Dec 03 21:19:21 compute-0 sudo[156926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:22 compute-0 python3.9[156928]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:22 compute-0 sudo[156926]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:22 compute-0 ceph-mon[75204]: pgmap v383: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:23 compute-0 python3.9[157081]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 03 21:19:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v384: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:23 compute-0 sudo[157231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guwjysjtfqdvuleqrvaxyztkyxwoqpmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796763.4673903-235-210826218904151/AnsiballZ_systemd_service.py'
Dec 03 21:19:23 compute-0 sudo[157231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:24 compute-0 python3.9[157233]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:19:24 compute-0 systemd[1]: Reloading.
Dec 03 21:19:24 compute-0 systemd-sysv-generator[157262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:19:24 compute-0 systemd-rc-local-generator[157259]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:19:24 compute-0 sudo[157231]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:24 compute-0 ceph-mon[75204]: pgmap v384: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:25 compute-0 sudo[157418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlszzsnktteercvetprgepdfunviqqkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796764.7306154-243-123796976776604/AnsiballZ_command.py'
Dec 03 21:19:25 compute-0 sudo[157418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:25 compute-0 python3.9[157420]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:25 compute-0 sudo[157418]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v385: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:25 compute-0 sudo[157571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzcxdmokwqvzgoqrnfbhicsljimtombu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796765.537679-243-132780951253227/AnsiballZ_command.py'
Dec 03 21:19:25 compute-0 sudo[157571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:25 compute-0 python3.9[157573]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:26 compute-0 sudo[157571]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:26 compute-0 sudo[157724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehedgxaenxysbikzvpgurigfnjtvfcny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796766.1991117-243-22396127589820/AnsiballZ_command.py'
Dec 03 21:19:26 compute-0 sudo[157724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:26 compute-0 ceph-mon[75204]: pgmap v385: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:26 compute-0 python3.9[157726]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:26 compute-0 sudo[157724]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:27 compute-0 sudo[157877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmcavimshrxqoqhhitfvovqeahcpdjlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796766.9424057-243-250165559173284/AnsiballZ_command.py'
Dec 03 21:19:27 compute-0 sudo[157877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v386: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:27 compute-0 python3.9[157879]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:27 compute-0 sudo[157877]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:19:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:19:28 compute-0 sudo[158030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xytgodzvhmzepkmcyatafoxyvnuqdcld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796767.707891-243-228170225734664/AnsiballZ_command.py'
Dec 03 21:19:28 compute-0 sudo[158030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:28 compute-0 python3.9[158032]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:28 compute-0 sudo[158030]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:28 compute-0 ceph-mon[75204]: pgmap v386: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:28 compute-0 sudo[158183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wynpbeupdatimxykwkvfdeanwbahozzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796768.3938956-243-213500046108618/AnsiballZ_command.py'
Dec 03 21:19:28 compute-0 sudo[158183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:28 compute-0 python3.9[158185]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:29 compute-0 sudo[158183]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v387: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:29 compute-0 sudo[158336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtxxokzbqwkwddfjtfabkjrpnpbhreda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796769.1855106-243-273756021766516/AnsiballZ_command.py'
Dec 03 21:19:29 compute-0 sudo[158336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:29 compute-0 python3.9[158338]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:19:29 compute-0 sudo[158336]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:30 compute-0 ceph-mon[75204]: pgmap v387: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:30 compute-0 sudo[158489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlzjukooshqutwpsylhjlfgalxitlvxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796770.2536802-297-188685696517177/AnsiballZ_getent.py'
Dec 03 21:19:30 compute-0 sudo[158489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:31 compute-0 python3.9[158491]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 03 21:19:31 compute-0 sudo[158489]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v388: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:31 compute-0 sudo[158642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzfejoxhajwrokcwbtjwcxmoonirikdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796771.243117-305-138327677829305/AnsiballZ_group.py'
Dec 03 21:19:31 compute-0 sudo[158642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:31 compute-0 python3.9[158644]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 03 21:19:31 compute-0 groupadd[158645]: group added to /etc/group: name=libvirt, GID=42473
Dec 03 21:19:31 compute-0 groupadd[158645]: group added to /etc/gshadow: name=libvirt
Dec 03 21:19:32 compute-0 groupadd[158645]: new group: name=libvirt, GID=42473
Dec 03 21:19:32 compute-0 sudo[158642]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:32 compute-0 ceph-mon[75204]: pgmap v388: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:32 compute-0 sudo[158800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrbngloenzesledtkitsclvsmuwasffq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796772.2871504-313-159499124935088/AnsiballZ_user.py'
Dec 03 21:19:32 compute-0 sudo[158800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:33 compute-0 python3.9[158802]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 03 21:19:33 compute-0 useradd[158804]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 03 21:19:33 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:19:33 compute-0 sudo[158800]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v389: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:34 compute-0 sudo[158961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcylqrzrpmecfuscnkkgcbeluemeyfzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796773.6588092-324-56245943974655/AnsiballZ_setup.py'
Dec 03 21:19:34 compute-0 sudo[158961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:34 compute-0 python3.9[158963]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:19:34 compute-0 ceph-mon[75204]: pgmap v389: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:34 compute-0 sudo[158961]: pam_unix(sudo:session): session closed for user root
Dec 03 21:19:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:35 compute-0 sudo[159045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adheslshyegvsrbhvjtfuvzycyjwbgiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796773.6588092-324-56245943974655/AnsiballZ_dnf.py'
Dec 03 21:19:35 compute-0 sudo[159045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:19:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v390: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:35 compute-0 python3.9[159047]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:19:35 compute-0 ceph-mon[75204]: pgmap v390: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:37 compute-0 podman[159051]: 2025-12-03 21:19:37.194979488 +0000 UTC m=+0.125993563 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 21:19:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v391: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:38 compute-0 ceph-mon[75204]: pgmap v391: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v392: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:40 compute-0 ceph-mon[75204]: pgmap v392: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v393: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:42 compute-0 ceph-mon[75204]: pgmap v393: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v394: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:44 compute-0 ceph-mon[75204]: pgmap v394: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v395: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:46 compute-0 ceph-mon[75204]: pgmap v395: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v396: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:48 compute-0 ceph-mon[75204]: pgmap v396: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:19:48.923 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:19:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:19:48.926 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:19:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:19:48.926 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:19:49 compute-0 podman[159221]: 2025-12-03 21:19:49.164766433 +0000 UTC m=+0.084513010 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 03 21:19:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v397: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:50 compute-0 ceph-mon[75204]: pgmap v397: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v398: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:19:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:19:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:19:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:19:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:19:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:19:52 compute-0 ceph-mon[75204]: pgmap v398: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v399: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:54 compute-0 ceph-mon[75204]: pgmap v399: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:19:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v400: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:56 compute-0 ceph-mon[75204]: pgmap v400: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v401: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #24. Immutable memtables: 0.
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.592216) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 24
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796797592299, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1495, "num_deletes": 251, "total_data_size": 1662403, "memory_usage": 1691344, "flush_reason": "Manual Compaction"}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #25: started
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796797604185, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 25, "file_size": 1620371, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7557, "largest_seqno": 9051, "table_properties": {"data_size": 1613487, "index_size": 4023, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13561, "raw_average_key_size": 18, "raw_value_size": 1599650, "raw_average_value_size": 2240, "num_data_blocks": 189, "num_entries": 714, "num_filter_entries": 714, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796633, "oldest_key_time": 1764796633, "file_creation_time": 1764796797, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 25, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 12008 microseconds, and 4253 cpu microseconds.
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.604235) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #25: 1620371 bytes OK
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.604256) [db/memtable_list.cc:519] [default] Level-0 commit table #25 started
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.605110) [db/memtable_list.cc:722] [default] Level-0 commit table #25: memtable #1 done
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.605124) EVENT_LOG_v1 {"time_micros": 1764796797605120, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.605146) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1655887, prev total WAL file size 1655887, number of live WAL files 2.
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000021.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.605886) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [25(1582KB)], [23(4392KB)]
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796797605944, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [25], "files_L6": [23], "score": -1, "input_data_size": 6117825, "oldest_snapshot_seqno": -1}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #26: 2803 keys, 4865008 bytes, temperature: kUnknown
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796797654345, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 26, "file_size": 4865008, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4843590, "index_size": 13309, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7045, "raw_key_size": 65065, "raw_average_key_size": 23, "raw_value_size": 4790654, "raw_average_value_size": 1709, "num_data_blocks": 596, "num_entries": 2803, "num_filter_entries": 2803, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764796797, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.654878) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 4865008 bytes
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.656389) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 125.9 rd, 100.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 4.3 +0.0 blob) out(4.6 +0.0 blob), read-write-amplify(6.8) write-amplify(3.0) OK, records in: 3317, records dropped: 514 output_compression: NoCompression
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.656420) EVENT_LOG_v1 {"time_micros": 1764796797656405, "job": 8, "event": "compaction_finished", "compaction_time_micros": 48584, "compaction_time_cpu_micros": 22794, "output_level": 6, "num_output_files": 1, "total_output_size": 4865008, "num_input_records": 3317, "num_output_records": 2803, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000025.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796797657453, "job": 8, "event": "table_file_deletion", "file_number": 25}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764796797659230, "job": 8, "event": "table_file_deletion", "file_number": 23}
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.605781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.659457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.659465) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.659470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.659474) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:19:57 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:19:57.659478) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:19:58 compute-0 ceph-mon[75204]: pgmap v401: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:19:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v402: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:00 compute-0 ceph-mon[75204]: pgmap v402: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:00 compute-0 sudo[159286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:20:00 compute-0 sudo[159286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:00 compute-0 sudo[159286]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:00 compute-0 sudo[159311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:20:00 compute-0 sudo[159311]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:01 compute-0 sudo[159311]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v403: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:20:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:20:01 compute-0 sudo[159365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:20:01 compute-0 sudo[159365]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:01 compute-0 sudo[159365]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:01 compute-0 sudo[159390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:20:01 compute-0 sudo[159390]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:20:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:20:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:20:01 compute-0 podman[159426]: 2025-12-03 21:20:01.918799241 +0000 UTC m=+0.068026129 container create 63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_shirley, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Dec 03 21:20:01 compute-0 systemd[1]: Started libpod-conmon-63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9.scope.
Dec 03 21:20:01 compute-0 podman[159426]: 2025-12-03 21:20:01.893512755 +0000 UTC m=+0.042739723 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:20:02 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:20:02 compute-0 podman[159426]: 2025-12-03 21:20:02.045441737 +0000 UTC m=+0.194668715 container init 63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_shirley, org.label-schema.build-date=20251030, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:20:02 compute-0 podman[159426]: 2025-12-03 21:20:02.059436251 +0000 UTC m=+0.208663149 container start 63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_shirley, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:20:02 compute-0 podman[159426]: 2025-12-03 21:20:02.063249804 +0000 UTC m=+0.212476782 container attach 63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_shirley, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:20:02 compute-0 interesting_shirley[159442]: 167 167
Dec 03 21:20:02 compute-0 systemd[1]: libpod-63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9.scope: Deactivated successfully.
Dec 03 21:20:02 compute-0 podman[159426]: 2025-12-03 21:20:02.072884341 +0000 UTC m=+0.222111229 container died 63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_shirley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:20:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-5076c30a60eb46bd1412f7d9e9d0abf6774ef4080deade970ab78fc2ef7530a9-merged.mount: Deactivated successfully.
Dec 03 21:20:02 compute-0 podman[159426]: 2025-12-03 21:20:02.131360135 +0000 UTC m=+0.280587053 container remove 63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_shirley, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:20:02 compute-0 systemd[1]: libpod-conmon-63a0579851b37bf5208137dda26a18239520ac565dfb402e4a545e6d325d0aa9.scope: Deactivated successfully.
Dec 03 21:20:02 compute-0 podman[159465]: 2025-12-03 21:20:02.329616006 +0000 UTC m=+0.055333011 container create be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_euler, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:20:02 compute-0 systemd[1]: Started libpod-conmon-be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890.scope.
Dec 03 21:20:02 compute-0 podman[159465]: 2025-12-03 21:20:02.303416976 +0000 UTC m=+0.029133991 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:20:02 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:20:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9038a53f4f5b82a3517ec716b8436d10c1c630d453e4b77bd4af332b992548/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9038a53f4f5b82a3517ec716b8436d10c1c630d453e4b77bd4af332b992548/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9038a53f4f5b82a3517ec716b8436d10c1c630d453e4b77bd4af332b992548/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9038a53f4f5b82a3517ec716b8436d10c1c630d453e4b77bd4af332b992548/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c9038a53f4f5b82a3517ec716b8436d10c1c630d453e4b77bd4af332b992548/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:02 compute-0 podman[159465]: 2025-12-03 21:20:02.448821493 +0000 UTC m=+0.174538548 container init be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_euler, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:20:02 compute-0 podman[159465]: 2025-12-03 21:20:02.461593804 +0000 UTC m=+0.187310799 container start be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_euler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:20:02 compute-0 podman[159465]: 2025-12-03 21:20:02.465367315 +0000 UTC m=+0.191084380 container attach be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_euler, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:20:02 compute-0 ceph-mon[75204]: pgmap v403: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:03 compute-0 busy_euler[159481]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:20:03 compute-0 busy_euler[159481]: --> All data devices are unavailable
Dec 03 21:20:03 compute-0 systemd[1]: libpod-be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890.scope: Deactivated successfully.
Dec 03 21:20:03 compute-0 podman[159465]: 2025-12-03 21:20:03.041331015 +0000 UTC m=+0.767048030 container died be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_euler, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:20:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c9038a53f4f5b82a3517ec716b8436d10c1c630d453e4b77bd4af332b992548-merged.mount: Deactivated successfully.
Dec 03 21:20:03 compute-0 podman[159465]: 2025-12-03 21:20:03.100023805 +0000 UTC m=+0.825740820 container remove be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_euler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:20:03 compute-0 systemd[1]: libpod-conmon-be88db6e62308f314a85e2baec9c71812221cad5ffd101f6dbe06af3c6dbe890.scope: Deactivated successfully.
Dec 03 21:20:03 compute-0 sudo[159390]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:03 compute-0 sudo[159513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:20:03 compute-0 sudo[159513]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:03 compute-0 sudo[159513]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:03 compute-0 sudo[159538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:20:03 compute-0 sudo[159538]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v404: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:03 compute-0 podman[159576]: 2025-12-03 21:20:03.633613181 +0000 UTC m=+0.073707041 container create d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_jones, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:20:03 compute-0 systemd[1]: Started libpod-conmon-d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66.scope.
Dec 03 21:20:03 compute-0 podman[159576]: 2025-12-03 21:20:03.605396167 +0000 UTC m=+0.045490077 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:20:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:20:03 compute-0 podman[159576]: 2025-12-03 21:20:03.738328462 +0000 UTC m=+0.178422302 container init d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_jones, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:20:03 compute-0 podman[159576]: 2025-12-03 21:20:03.750188539 +0000 UTC m=+0.190282399 container start d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_jones, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:20:03 compute-0 podman[159576]: 2025-12-03 21:20:03.754936506 +0000 UTC m=+0.195030366 container attach d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_jones, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:20:03 compute-0 stupefied_jones[159593]: 167 167
Dec 03 21:20:03 compute-0 podman[159576]: 2025-12-03 21:20:03.757198456 +0000 UTC m=+0.197292316 container died d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_jones, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:20:03 compute-0 systemd[1]: libpod-d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66.scope: Deactivated successfully.
Dec 03 21:20:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-872a2c0d673d6a6e1490f2b993ba1be2a663fa916ed156078b2988eb40fd763e-merged.mount: Deactivated successfully.
Dec 03 21:20:03 compute-0 podman[159576]: 2025-12-03 21:20:03.820814527 +0000 UTC m=+0.260908387 container remove d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_jones, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0)
Dec 03 21:20:03 compute-0 systemd[1]: libpod-conmon-d82c8b647b0dc487345142831ae2fc8d8d2ab61c8f3d18454a89b282d5d44c66.scope: Deactivated successfully.
Dec 03 21:20:04 compute-0 podman[159617]: 2025-12-03 21:20:04.058970734 +0000 UTC m=+0.071731428 container create 609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_davinci, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:20:04 compute-0 systemd[1]: Started libpod-conmon-609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08.scope.
Dec 03 21:20:04 compute-0 podman[159617]: 2025-12-03 21:20:04.029010284 +0000 UTC m=+0.041771038 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:20:04 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:20:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd828d11d900fc2f746b6ca4d5d0eda44ea7410e223e2c3fb70061cd52c8c297/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd828d11d900fc2f746b6ca4d5d0eda44ea7410e223e2c3fb70061cd52c8c297/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd828d11d900fc2f746b6ca4d5d0eda44ea7410e223e2c3fb70061cd52c8c297/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd828d11d900fc2f746b6ca4d5d0eda44ea7410e223e2c3fb70061cd52c8c297/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:04 compute-0 podman[159617]: 2025-12-03 21:20:04.232195456 +0000 UTC m=+0.244956170 container init 609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_davinci, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:20:04 compute-0 podman[159617]: 2025-12-03 21:20:04.239087601 +0000 UTC m=+0.251848305 container start 609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_davinci, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:20:04 compute-0 podman[159617]: 2025-12-03 21:20:04.242730038 +0000 UTC m=+0.255490742 container attach 609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_davinci, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:20:04 compute-0 pensive_davinci[159633]: {
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:     "0": [
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:         {
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "devices": [
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "/dev/loop3"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             ],
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_name": "ceph_lv0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_size": "21470642176",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "name": "ceph_lv0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "tags": {
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cluster_name": "ceph",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.crush_device_class": "",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.encrypted": "0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.objectstore": "bluestore",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osd_id": "0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.type": "block",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.vdo": "0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.with_tpm": "0"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             },
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "type": "block",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "vg_name": "ceph_vg0"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:         }
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:     ],
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:     "1": [
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:         {
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "devices": [
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "/dev/loop4"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             ],
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_name": "ceph_lv1",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_size": "21470642176",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "name": "ceph_lv1",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "tags": {
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cluster_name": "ceph",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.crush_device_class": "",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.encrypted": "0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.objectstore": "bluestore",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osd_id": "1",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.type": "block",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.vdo": "0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.with_tpm": "0"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             },
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "type": "block",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "vg_name": "ceph_vg1"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:         }
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:     ],
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:     "2": [
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:         {
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "devices": [
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "/dev/loop5"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             ],
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_name": "ceph_lv2",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_size": "21470642176",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "name": "ceph_lv2",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "tags": {
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.cluster_name": "ceph",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.crush_device_class": "",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.encrypted": "0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.objectstore": "bluestore",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osd_id": "2",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.type": "block",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.vdo": "0",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:                 "ceph.with_tpm": "0"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             },
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "type": "block",
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:             "vg_name": "ceph_vg2"
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:         }
Dec 03 21:20:04 compute-0 pensive_davinci[159633]:     ]
Dec 03 21:20:04 compute-0 pensive_davinci[159633]: }
Dec 03 21:20:04 compute-0 systemd[1]: libpod-609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08.scope: Deactivated successfully.
Dec 03 21:20:04 compute-0 podman[159617]: 2025-12-03 21:20:04.573259936 +0000 UTC m=+0.586020610 container died 609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_davinci, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:20:04 compute-0 ceph-mon[75204]: pgmap v404: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd828d11d900fc2f746b6ca4d5d0eda44ea7410e223e2c3fb70061cd52c8c297-merged.mount: Deactivated successfully.
Dec 03 21:20:04 compute-0 podman[159617]: 2025-12-03 21:20:04.984046699 +0000 UTC m=+0.996807393 container remove 609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_davinci, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:20:04 compute-0 systemd[1]: libpod-conmon-609e5506fc27820ad7bdad44f9427bc4b4b60e3c6a4d4124c7ca9a975833cd08.scope: Deactivated successfully.
Dec 03 21:20:05 compute-0 sudo[159538]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:05 compute-0 sudo[159662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:20:05 compute-0 sudo[159662]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:05 compute-0 sudo[159662]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:05 compute-0 kernel: SELinux:  Converting 2769 SID table entries...
Dec 03 21:20:05 compute-0 sudo[159687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:20:05 compute-0 sudo[159687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:05 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 21:20:05 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 03 21:20:05 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 21:20:05 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 03 21:20:05 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 21:20:05 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 21:20:05 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 21:20:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v405: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:05 compute-0 podman[159727]: 2025-12-03 21:20:05.558853958 +0000 UTC m=+0.082840236 container create 8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_kowalevski, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 03 21:20:05 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 03 21:20:05 compute-0 podman[159727]: 2025-12-03 21:20:05.516145666 +0000 UTC m=+0.040132004 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:20:05 compute-0 systemd[1]: Started libpod-conmon-8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa.scope.
Dec 03 21:20:05 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:20:05 compute-0 podman[159727]: 2025-12-03 21:20:05.680505271 +0000 UTC m=+0.204491599 container init 8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_kowalevski, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3)
Dec 03 21:20:05 compute-0 podman[159727]: 2025-12-03 21:20:05.691147565 +0000 UTC m=+0.215133813 container start 8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_kowalevski, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:20:05 compute-0 podman[159727]: 2025-12-03 21:20:05.695610654 +0000 UTC m=+0.219597002 container attach 8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_kowalevski, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:20:05 compute-0 youthful_kowalevski[159743]: 167 167
Dec 03 21:20:05 compute-0 systemd[1]: libpod-8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa.scope: Deactivated successfully.
Dec 03 21:20:05 compute-0 podman[159727]: 2025-12-03 21:20:05.700066494 +0000 UTC m=+0.224052802 container died 8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_kowalevski, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:20:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-a152a6800551a26a5c491feceb42fb3cf7291ec9d4bf59801e80a0e69c81734f-merged.mount: Deactivated successfully.
Dec 03 21:20:05 compute-0 podman[159727]: 2025-12-03 21:20:05.751287273 +0000 UTC m=+0.275273531 container remove 8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_kowalevski, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:20:05 compute-0 systemd[1]: libpod-conmon-8cea69ff0e006d0b6c4a82280279fef7efdba44802493c21bee7990a72b6f0fa.scope: Deactivated successfully.
Dec 03 21:20:05 compute-0 ceph-mon[75204]: pgmap v405: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:05 compute-0 podman[159768]: 2025-12-03 21:20:05.955601816 +0000 UTC m=+0.066743256 container create b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_cannon, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:20:06 compute-0 systemd[1]: Started libpod-conmon-b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff.scope.
Dec 03 21:20:06 compute-0 podman[159768]: 2025-12-03 21:20:05.929353944 +0000 UTC m=+0.040495474 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:20:06 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:20:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adeffc4b78eeb92b775884d0e54b73ae2566810a0d8d8a814f3de9ee3b46a3b6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adeffc4b78eeb92b775884d0e54b73ae2566810a0d8d8a814f3de9ee3b46a3b6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adeffc4b78eeb92b775884d0e54b73ae2566810a0d8d8a814f3de9ee3b46a3b6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adeffc4b78eeb92b775884d0e54b73ae2566810a0d8d8a814f3de9ee3b46a3b6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:20:06 compute-0 podman[159768]: 2025-12-03 21:20:06.059171225 +0000 UTC m=+0.170312705 container init b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_cannon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:20:06 compute-0 podman[159768]: 2025-12-03 21:20:06.073945441 +0000 UTC m=+0.185086901 container start b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_cannon, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:20:06 compute-0 podman[159768]: 2025-12-03 21:20:06.078390809 +0000 UTC m=+0.189532289 container attach b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_cannon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 03 21:20:06 compute-0 lvm[159862]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:20:06 compute-0 lvm[159865]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:20:06 compute-0 lvm[159866]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:20:06 compute-0 lvm[159866]: VG ceph_vg2 finished
Dec 03 21:20:06 compute-0 lvm[159862]: VG ceph_vg0 finished
Dec 03 21:20:06 compute-0 lvm[159865]: VG ceph_vg1 finished
Dec 03 21:20:06 compute-0 youthful_cannon[159784]: {}
Dec 03 21:20:06 compute-0 systemd[1]: libpod-b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff.scope: Deactivated successfully.
Dec 03 21:20:06 compute-0 systemd[1]: libpod-b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff.scope: Consumed 1.285s CPU time.
Dec 03 21:20:06 compute-0 podman[159768]: 2025-12-03 21:20:06.89599732 +0000 UTC m=+1.007138780 container died b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_cannon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:20:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-adeffc4b78eeb92b775884d0e54b73ae2566810a0d8d8a814f3de9ee3b46a3b6-merged.mount: Deactivated successfully.
Dec 03 21:20:06 compute-0 podman[159768]: 2025-12-03 21:20:06.954592407 +0000 UTC m=+1.065733847 container remove b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=youthful_cannon, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:20:06 compute-0 systemd[1]: libpod-conmon-b993f3c2ff8a772e6c9327b715a6bc378d00e0792c03a60a7d43d03ad6d185ff.scope: Deactivated successfully.
Dec 03 21:20:07 compute-0 sudo[159687]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:20:07 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:20:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:20:07 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:20:07 compute-0 sudo[159883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:20:07 compute-0 sudo[159883]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:20:07 compute-0 sudo[159883]: pam_unix(sudo:session): session closed for user root
Dec 03 21:20:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v406: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:08 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:20:08 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:20:08 compute-0 ceph-mon[75204]: pgmap v406: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:08 compute-0 podman[159908]: 2025-12-03 21:20:08.169795679 +0000 UTC m=+0.106797596 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 03 21:20:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v407: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:10 compute-0 ceph-mon[75204]: pgmap v407: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v408: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:12 compute-0 ceph-mon[75204]: pgmap v408: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v409: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:14 compute-0 ceph-mon[75204]: pgmap v409: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:15 compute-0 kernel: SELinux:  Converting 2769 SID table entries...
Dec 03 21:20:15 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 21:20:15 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 03 21:20:15 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 21:20:15 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 03 21:20:15 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 21:20:15 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 21:20:15 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 21:20:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v410: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:16 compute-0 ceph-mon[75204]: pgmap v410: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v411: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:18 compute-0 ceph-mon[75204]: pgmap v411: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v412: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:19 compute-0 ceph-mon[75204]: pgmap v412: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:20 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 03 21:20:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:20 compute-0 podman[159943]: 2025-12-03 21:20:20.173212853 +0000 UTC m=+0.093954593 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:20:21
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['backups', 'cephfs.cephfs.data', '.mgr', 'images', 'cephfs.cephfs.meta', 'volumes', 'vms']
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v413: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:20:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:20:22 compute-0 ceph-mon[75204]: pgmap v413: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v414: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:24 compute-0 ceph-mon[75204]: pgmap v414: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v415: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:26 compute-0 ceph-mon[75204]: pgmap v415: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v416: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:20:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:20:28 compute-0 ceph-mon[75204]: pgmap v416: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v417: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:30 compute-0 ceph-mon[75204]: pgmap v417: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v418: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:32 compute-0 ceph-mon[75204]: pgmap v418: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v419: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:34 compute-0 ceph-mon[75204]: pgmap v419: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v420: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:36 compute-0 ceph-mon[75204]: pgmap v420: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v421: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:38 compute-0 ceph-mon[75204]: pgmap v421: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:39 compute-0 podman[165653]: 2025-12-03 21:20:39.17427111 +0000 UTC m=+0.113998159 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:20:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v422: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:40 compute-0 ceph-mon[75204]: pgmap v422: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v423: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:42 compute-0 ceph-mon[75204]: pgmap v423: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v424: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:44 compute-0 ceph-mon[75204]: pgmap v424: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v425: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:45 compute-0 ceph-mon[75204]: pgmap v425: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v426: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:48 compute-0 ceph-mon[75204]: pgmap v426: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:20:48.924 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:20:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:20:48.924 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:20:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:20:48.924 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:20:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v427: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:50 compute-0 ceph-mon[75204]: pgmap v427: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:51 compute-0 podman[171012]: 2025-12-03 21:20:51.139199394 +0000 UTC m=+0.070190687 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 03 21:20:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v428: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:20:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:20:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:20:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:20:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:20:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:20:52 compute-0 ceph-mon[75204]: pgmap v428: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v429: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:54 compute-0 ceph-mon[75204]: pgmap v429: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:20:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v430: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:56 compute-0 ceph-mon[75204]: pgmap v430: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v431: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:57 compute-0 ceph-mon[75204]: pgmap v431: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:20:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v432: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:00 compute-0 ceph-mon[75204]: pgmap v432: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v433: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:02 compute-0 ceph-mon[75204]: pgmap v433: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v434: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:04 compute-0 ceph-mon[75204]: pgmap v434: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v435: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:06 compute-0 ceph-mon[75204]: pgmap v435: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:07 compute-0 sudo[176815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:21:07 compute-0 sudo[176815]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:07 compute-0 sudo[176815]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:07 compute-0 sudo[176840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:21:07 compute-0 sudo[176840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v436: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:07 compute-0 podman[176911]: 2025-12-03 21:21:07.86965549 +0000 UTC m=+0.089929031 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:21:07 compute-0 podman[176911]: 2025-12-03 21:21:07.996523671 +0000 UTC m=+0.216797242 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:21:08 compute-0 ceph-mon[75204]: pgmap v436: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:08 compute-0 sudo[176840]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:21:08 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:21:08 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:08 compute-0 sudo[177074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:21:08 compute-0 sudo[177074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:08 compute-0 sudo[177074]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:08 compute-0 sudo[177099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:21:08 compute-0 sudo[177099]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v437: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:09 compute-0 sudo[177099]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:21:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:21:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:21:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:21:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:21:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:21:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:21:09 compute-0 sudo[177155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:21:09 compute-0 sudo[177155]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:09 compute-0 sudo[177155]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:09 compute-0 sudo[177186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:09 compute-0 ceph-mon[75204]: pgmap v437: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:21:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:21:09 compute-0 sudo[177186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:09 compute-0 podman[177179]: 2025-12-03 21:21:09.872007314 +0000 UTC m=+0.124209220 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:21:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:10 compute-0 podman[177242]: 2025-12-03 21:21:10.166499372 +0000 UTC m=+0.064673225 container create 99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_ptolemy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:21:10 compute-0 systemd[1]: Started libpod-conmon-99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3.scope.
Dec 03 21:21:10 compute-0 podman[177242]: 2025-12-03 21:21:10.137769723 +0000 UTC m=+0.035943616 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:21:10 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:21:10 compute-0 podman[177242]: 2025-12-03 21:21:10.27590788 +0000 UTC m=+0.174081793 container init 99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_ptolemy, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle)
Dec 03 21:21:10 compute-0 podman[177242]: 2025-12-03 21:21:10.289019596 +0000 UTC m=+0.187193429 container start 99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_ptolemy, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:21:10 compute-0 podman[177242]: 2025-12-03 21:21:10.293140688 +0000 UTC m=+0.191314591 container attach 99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_ptolemy, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 03 21:21:10 compute-0 practical_ptolemy[177258]: 167 167
Dec 03 21:21:10 compute-0 systemd[1]: libpod-99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3.scope: Deactivated successfully.
Dec 03 21:21:10 compute-0 podman[177242]: 2025-12-03 21:21:10.298959616 +0000 UTC m=+0.197133479 container died 99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_ptolemy, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:21:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-46d857fd9e8223772005d101da5a22ed3d646476fc16c27cfe7102e0b792673f-merged.mount: Deactivated successfully.
Dec 03 21:21:10 compute-0 podman[177242]: 2025-12-03 21:21:10.355385756 +0000 UTC m=+0.253559609 container remove 99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_ptolemy, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:21:10 compute-0 systemd[1]: libpod-conmon-99ae0296353e733a3610ed2a590a4bf1e3583613aa8974726f03a3912bf0e2b3.scope: Deactivated successfully.
Dec 03 21:21:10 compute-0 podman[177282]: 2025-12-03 21:21:10.536498218 +0000 UTC m=+0.039648146 container create 24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=fervent_lewin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 03 21:21:10 compute-0 systemd[1]: Started libpod-conmon-24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a.scope.
Dec 03 21:21:10 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:21:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adb37f58d73bb96add7ef54cf22cbe5047a6f4d565aa7071704a663dc96c4fbb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adb37f58d73bb96add7ef54cf22cbe5047a6f4d565aa7071704a663dc96c4fbb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adb37f58d73bb96add7ef54cf22cbe5047a6f4d565aa7071704a663dc96c4fbb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adb37f58d73bb96add7ef54cf22cbe5047a6f4d565aa7071704a663dc96c4fbb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/adb37f58d73bb96add7ef54cf22cbe5047a6f4d565aa7071704a663dc96c4fbb/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:10 compute-0 podman[177282]: 2025-12-03 21:21:10.519140098 +0000 UTC m=+0.022290056 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:21:10 compute-0 podman[177282]: 2025-12-03 21:21:10.621442393 +0000 UTC m=+0.124592381 container init 24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=fervent_lewin, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:21:10 compute-0 podman[177282]: 2025-12-03 21:21:10.626552542 +0000 UTC m=+0.129702510 container start 24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=fervent_lewin, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:21:10 compute-0 podman[177282]: 2025-12-03 21:21:10.630744765 +0000 UTC m=+0.133894733 container attach 24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=fervent_lewin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:21:11 compute-0 fervent_lewin[177298]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:21:11 compute-0 fervent_lewin[177298]: --> All data devices are unavailable
Dec 03 21:21:11 compute-0 systemd[1]: libpod-24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a.scope: Deactivated successfully.
Dec 03 21:21:11 compute-0 podman[177282]: 2025-12-03 21:21:11.208034705 +0000 UTC m=+0.711184703 container died 24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=fervent_lewin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:21:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-adb37f58d73bb96add7ef54cf22cbe5047a6f4d565aa7071704a663dc96c4fbb-merged.mount: Deactivated successfully.
Dec 03 21:21:11 compute-0 podman[177282]: 2025-12-03 21:21:11.278844845 +0000 UTC m=+0.781994813 container remove 24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=fervent_lewin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:21:11 compute-0 systemd[1]: libpod-conmon-24ac335ea57eb37bed316a0080299568390d88c8b8f613a515057509f0254e7a.scope: Deactivated successfully.
Dec 03 21:21:11 compute-0 sudo[177186]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v438: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:11 compute-0 sudo[177332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:21:11 compute-0 sudo[177332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:11 compute-0 sudo[177332]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:11 compute-0 sudo[177357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:21:11 compute-0 sudo[177357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:11 compute-0 podman[177394]: 2025-12-03 21:21:11.879792447 +0000 UTC m=+0.070485533 container create db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=wizardly_murdock, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:21:11 compute-0 systemd[1]: Started libpod-conmon-db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52.scope.
Dec 03 21:21:11 compute-0 podman[177394]: 2025-12-03 21:21:11.851218802 +0000 UTC m=+0.041911708 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:21:11 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:21:11 compute-0 podman[177394]: 2025-12-03 21:21:11.979007687 +0000 UTC m=+0.169700543 container init db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=wizardly_murdock, org.label-schema.build-date=20251030, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:21:11 compute-0 podman[177394]: 2025-12-03 21:21:11.988356582 +0000 UTC m=+0.179049408 container start db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=wizardly_murdock, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030)
Dec 03 21:21:11 compute-0 podman[177394]: 2025-12-03 21:21:11.993032638 +0000 UTC m=+0.183725464 container attach db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=wizardly_murdock, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030)
Dec 03 21:21:11 compute-0 wizardly_murdock[177410]: 167 167
Dec 03 21:21:11 compute-0 systemd[1]: libpod-db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52.scope: Deactivated successfully.
Dec 03 21:21:11 compute-0 podman[177394]: 2025-12-03 21:21:11.995752842 +0000 UTC m=+0.186445728 container died db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=wizardly_murdock, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:21:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-44fbb91765d700a77d987c0e23575b79a32e68b4bdb865dbf2b470630cb2c590-merged.mount: Deactivated successfully.
Dec 03 21:21:12 compute-0 podman[177394]: 2025-12-03 21:21:12.046935141 +0000 UTC m=+0.237627957 container remove db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=wizardly_murdock, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:21:12 compute-0 systemd[1]: libpod-conmon-db63a9debb38af9bd72a0f7f64c1d2820d052d40487fd0cbc5819836ee80cc52.scope: Deactivated successfully.
Dec 03 21:21:12 compute-0 podman[177434]: 2025-12-03 21:21:12.308138116 +0000 UTC m=+0.060873493 container create 6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_wu, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:21:12 compute-0 systemd[1]: Started libpod-conmon-6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112.scope.
Dec 03 21:21:12 compute-0 podman[177434]: 2025-12-03 21:21:12.285016089 +0000 UTC m=+0.037751436 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:21:12 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:21:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a671ede9176e64498f1143b04ca399048fead870ce16516053f73f051ed659fa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a671ede9176e64498f1143b04ca399048fead870ce16516053f73f051ed659fa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a671ede9176e64498f1143b04ca399048fead870ce16516053f73f051ed659fa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a671ede9176e64498f1143b04ca399048fead870ce16516053f73f051ed659fa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:12 compute-0 podman[177434]: 2025-12-03 21:21:12.421441839 +0000 UTC m=+0.174177276 container init 6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_wu, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:21:12 compute-0 podman[177434]: 2025-12-03 21:21:12.431295997 +0000 UTC m=+0.184031374 container start 6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_wu, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 03 21:21:12 compute-0 podman[177434]: 2025-12-03 21:21:12.437217087 +0000 UTC m=+0.189952464 container attach 6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_wu, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:21:12 compute-0 ceph-mon[75204]: pgmap v438: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:12 compute-0 crazy_wu[177451]: {
Dec 03 21:21:12 compute-0 crazy_wu[177451]:     "0": [
Dec 03 21:21:12 compute-0 crazy_wu[177451]:         {
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "devices": [
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "/dev/loop3"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             ],
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_name": "ceph_lv0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_size": "21470642176",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "name": "ceph_lv0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "tags": {
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cluster_name": "ceph",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.crush_device_class": "",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.encrypted": "0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.objectstore": "bluestore",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osd_id": "0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.type": "block",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.vdo": "0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.with_tpm": "0"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             },
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "type": "block",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "vg_name": "ceph_vg0"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:         }
Dec 03 21:21:12 compute-0 crazy_wu[177451]:     ],
Dec 03 21:21:12 compute-0 crazy_wu[177451]:     "1": [
Dec 03 21:21:12 compute-0 crazy_wu[177451]:         {
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "devices": [
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "/dev/loop4"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             ],
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_name": "ceph_lv1",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_size": "21470642176",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "name": "ceph_lv1",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "tags": {
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cluster_name": "ceph",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.crush_device_class": "",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.encrypted": "0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.objectstore": "bluestore",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osd_id": "1",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.type": "block",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.vdo": "0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.with_tpm": "0"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             },
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "type": "block",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "vg_name": "ceph_vg1"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:         }
Dec 03 21:21:12 compute-0 crazy_wu[177451]:     ],
Dec 03 21:21:12 compute-0 crazy_wu[177451]:     "2": [
Dec 03 21:21:12 compute-0 crazy_wu[177451]:         {
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "devices": [
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "/dev/loop5"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             ],
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_name": "ceph_lv2",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_size": "21470642176",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "name": "ceph_lv2",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "tags": {
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.cluster_name": "ceph",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.crush_device_class": "",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.encrypted": "0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.objectstore": "bluestore",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osd_id": "2",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.type": "block",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.vdo": "0",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:                 "ceph.with_tpm": "0"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             },
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "type": "block",
Dec 03 21:21:12 compute-0 crazy_wu[177451]:             "vg_name": "ceph_vg2"
Dec 03 21:21:12 compute-0 crazy_wu[177451]:         }
Dec 03 21:21:12 compute-0 crazy_wu[177451]:     ]
Dec 03 21:21:12 compute-0 crazy_wu[177451]: }
Dec 03 21:21:12 compute-0 systemd[1]: libpod-6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112.scope: Deactivated successfully.
Dec 03 21:21:12 compute-0 podman[177434]: 2025-12-03 21:21:12.761826262 +0000 UTC m=+0.514561659 container died 6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_wu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:21:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-a671ede9176e64498f1143b04ca399048fead870ce16516053f73f051ed659fa-merged.mount: Deactivated successfully.
Dec 03 21:21:12 compute-0 podman[177434]: 2025-12-03 21:21:12.82183554 +0000 UTC m=+0.574570917 container remove 6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_wu, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:21:12 compute-0 systemd[1]: libpod-conmon-6dd71f86c1c70b5311eda3e7bd9c02607bb5dd6ccbc579920f336702c3f49112.scope: Deactivated successfully.
Dec 03 21:21:12 compute-0 sudo[177357]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:12 compute-0 sudo[177473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:21:12 compute-0 sudo[177473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:12 compute-0 sudo[177473]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:13 compute-0 sudo[177498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:21:13 compute-0 sudo[177498]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v439: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:13 compute-0 podman[177535]: 2025-12-03 21:21:13.436981416 +0000 UTC m=+0.073547936 container create 30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_chebyshev, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle)
Dec 03 21:21:13 compute-0 systemd[1]: Started libpod-conmon-30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c.scope.
Dec 03 21:21:13 compute-0 podman[177535]: 2025-12-03 21:21:13.406214692 +0000 UTC m=+0.042781262 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:21:13 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:21:13 compute-0 podman[177535]: 2025-12-03 21:21:13.557985508 +0000 UTC m=+0.194552038 container init 30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_chebyshev, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:21:13 compute-0 podman[177535]: 2025-12-03 21:21:13.569079519 +0000 UTC m=+0.205646049 container start 30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_chebyshev, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:21:13 compute-0 podman[177535]: 2025-12-03 21:21:13.573010596 +0000 UTC m=+0.209577126 container attach 30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_chebyshev, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:21:13 compute-0 loving_chebyshev[177552]: 167 167
Dec 03 21:21:13 compute-0 systemd[1]: libpod-30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c.scope: Deactivated successfully.
Dec 03 21:21:13 compute-0 podman[177535]: 2025-12-03 21:21:13.577665932 +0000 UTC m=+0.214232462 container died 30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_chebyshev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:21:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-740d42e8f25e2c600de5e718fbe21f75d7e5c227a40081bc255eb2bd6cec3ceb-merged.mount: Deactivated successfully.
Dec 03 21:21:13 compute-0 podman[177535]: 2025-12-03 21:21:13.625728986 +0000 UTC m=+0.262295516 container remove 30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=loving_chebyshev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:21:13 compute-0 systemd[1]: libpod-conmon-30c50648a4cc6b85017e1817d64849e5c055fba4f38cd7db78e14f02c4561c5c.scope: Deactivated successfully.
Dec 03 21:21:13 compute-0 podman[177577]: 2025-12-03 21:21:13.880805965 +0000 UTC m=+0.067654696 container create 3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_hodgkin, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:21:13 compute-0 systemd[1]: Started libpod-conmon-3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3.scope.
Dec 03 21:21:13 compute-0 podman[177577]: 2025-12-03 21:21:13.856821015 +0000 UTC m=+0.043669796 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:21:13 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:21:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c104b44d8859697a66f0b5a2b5c538138a2394f92183fcc81cb2460ec602170c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c104b44d8859697a66f0b5a2b5c538138a2394f92183fcc81cb2460ec602170c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c104b44d8859697a66f0b5a2b5c538138a2394f92183fcc81cb2460ec602170c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c104b44d8859697a66f0b5a2b5c538138a2394f92183fcc81cb2460ec602170c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:21:13 compute-0 podman[177577]: 2025-12-03 21:21:13.995011263 +0000 UTC m=+0.181860034 container init 3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_hodgkin, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:21:14 compute-0 podman[177577]: 2025-12-03 21:21:14.011366137 +0000 UTC m=+0.198214898 container start 3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_hodgkin, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:21:14 compute-0 podman[177577]: 2025-12-03 21:21:14.015849259 +0000 UTC m=+0.202698020 container attach 3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_hodgkin, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:21:14 compute-0 lvm[177679]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:21:14 compute-0 lvm[177679]: VG ceph_vg2 finished
Dec 03 21:21:14 compute-0 lvm[177677]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:21:14 compute-0 lvm[177677]: VG ceph_vg1 finished
Dec 03 21:21:14 compute-0 lvm[177676]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:21:14 compute-0 lvm[177676]: VG ceph_vg0 finished
Dec 03 21:21:14 compute-0 charming_hodgkin[177594]: {}
Dec 03 21:21:14 compute-0 systemd[1]: libpod-3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3.scope: Deactivated successfully.
Dec 03 21:21:14 compute-0 systemd[1]: libpod-3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3.scope: Consumed 1.373s CPU time.
Dec 03 21:21:14 compute-0 podman[177577]: 2025-12-03 21:21:14.852302747 +0000 UTC m=+1.039151508 container died 3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_hodgkin, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:21:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v440: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:15 compute-0 ceph-mon[75204]: pgmap v439: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-c104b44d8859697a66f0b5a2b5c538138a2394f92183fcc81cb2460ec602170c-merged.mount: Deactivated successfully.
Dec 03 21:21:16 compute-0 podman[177577]: 2025-12-03 21:21:16.269347646 +0000 UTC m=+2.456196407 container remove 3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_hodgkin, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:21:16 compute-0 sudo[177498]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:16 compute-0 systemd[1]: libpod-conmon-3c2163286f15650ce65b35d54b957b9ec4dbc3c5fd5d4b159b772433299a90b3.scope: Deactivated successfully.
Dec 03 21:21:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:21:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:21:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:16 compute-0 sudo[177694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:21:16 compute-0 sudo[177694]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:21:16 compute-0 sudo[177694]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:16 compute-0 kernel: SELinux:  Converting 2770 SID table entries...
Dec 03 21:21:16 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 03 21:21:16 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 03 21:21:16 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 03 21:21:16 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 03 21:21:16 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 03 21:21:16 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 03 21:21:16 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 03 21:21:16 compute-0 ceph-mon[75204]: pgmap v440: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:16 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:21:17 compute-0 groupadd[177727]: group added to /etc/group: name=dnsmasq, GID=991
Dec 03 21:21:17 compute-0 groupadd[177727]: group added to /etc/gshadow: name=dnsmasq
Dec 03 21:21:17 compute-0 groupadd[177727]: new group: name=dnsmasq, GID=991
Dec 03 21:21:17 compute-0 useradd[177734]: new user: name=dnsmasq, UID=991, GID=991, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 03 21:21:17 compute-0 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Dec 03 21:21:17 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 03 21:21:17 compute-0 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Dec 03 21:21:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v441: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:17 compute-0 ceph-mon[75204]: pgmap v441: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:18 compute-0 groupadd[177747]: group added to /etc/group: name=clevis, GID=990
Dec 03 21:21:19 compute-0 groupadd[177747]: group added to /etc/gshadow: name=clevis
Dec 03 21:21:19 compute-0 groupadd[177747]: new group: name=clevis, GID=990
Dec 03 21:21:19 compute-0 useradd[177754]: new user: name=clevis, UID=990, GID=990, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 03 21:21:19 compute-0 usermod[177764]: add 'clevis' to group 'tss'
Dec 03 21:21:19 compute-0 usermod[177764]: add 'clevis' to shadow group 'tss'
Dec 03 21:21:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v442: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:19 compute-0 ceph-mon[75204]: pgmap v442: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:21:21
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.data', 'volumes', 'images', 'vms', '.mgr', 'cephfs.cephfs.meta', 'backups']
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:21:21 compute-0 polkitd[43470]: Reloading rules
Dec 03 21:21:21 compute-0 polkitd[43470]: Collecting garbage unconditionally...
Dec 03 21:21:21 compute-0 polkitd[43470]: Loading rules from directory /etc/polkit-1/rules.d
Dec 03 21:21:21 compute-0 polkitd[43470]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 03 21:21:21 compute-0 polkitd[43470]: Finished loading, compiling and executing 3 rules
Dec 03 21:21:21 compute-0 polkitd[43470]: Reloading rules
Dec 03 21:21:21 compute-0 polkitd[43470]: Collecting garbage unconditionally...
Dec 03 21:21:21 compute-0 polkitd[43470]: Loading rules from directory /etc/polkit-1/rules.d
Dec 03 21:21:21 compute-0 polkitd[43470]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 03 21:21:21 compute-0 polkitd[43470]: Finished loading, compiling and executing 3 rules
Dec 03 21:21:21 compute-0 podman[177789]: 2025-12-03 21:21:21.869074581 +0000 UTC m=+0.088444810 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 03 21:21:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v443: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:22 compute-0 ceph-mon[75204]: pgmap v443: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v444: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:24 compute-0 ceph-mon[75204]: pgmap v444: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v445: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:25 compute-0 ceph-mon[75204]: pgmap v445: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:26 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Dec 03 21:21:26 compute-0 sshd[1007]: Received signal 15; terminating.
Dec 03 21:21:26 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Dec 03 21:21:26 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Dec 03 21:21:26 compute-0 systemd[1]: sshd.service: Consumed 3.464s CPU time, read 32.0K from disk, written 20.0K to disk.
Dec 03 21:21:26 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Dec 03 21:21:26 compute-0 systemd[1]: Stopping sshd-keygen.target...
Dec 03 21:21:26 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 03 21:21:26 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 03 21:21:26 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 03 21:21:26 compute-0 systemd[1]: Reached target sshd-keygen.target.
Dec 03 21:21:26 compute-0 systemd[1]: Starting OpenSSH server daemon...
Dec 03 21:21:26 compute-0 sshd[178579]: Server listening on 0.0.0.0 port 22.
Dec 03 21:21:26 compute-0 sshd[178579]: Server listening on :: port 22.
Dec 03 21:21:26 compute-0 systemd[1]: Started OpenSSH server daemon.
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:21:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v446: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:27 compute-0 ceph-mon[75204]: pgmap v446: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:29 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:21:29 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:21:29 compute-0 systemd[1]: Reloading.
Dec 03 21:21:29 compute-0 systemd-sysv-generator[178842]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:29 compute-0 systemd-rc-local-generator[178838]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v447: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 21:21:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:31 compute-0 ceph-mon[75204]: pgmap v447: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v448: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:32 compute-0 sudo[159045]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:33 compute-0 sudo[182369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvepjoojcocqwkqrdmijmzfkckyvjish ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796892.7839189-336-269127053106960/AnsiballZ_systemd.py'
Dec 03 21:21:33 compute-0 sudo[182369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:33 compute-0 ceph-mon[75204]: pgmap v448: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:33 compute-0 python3.9[182408]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:21:33 compute-0 systemd[1]: Reloading.
Dec 03 21:21:33 compute-0 systemd-sysv-generator[182780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:33 compute-0 systemd-rc-local-generator[182774]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v449: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:34 compute-0 sudo[182369]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:34 compute-0 sudo[183637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flppkjmyeyynzcklgxnvzprapprglazf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796894.4056094-336-231141234231529/AnsiballZ_systemd.py'
Dec 03 21:21:34 compute-0 sudo[183637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:35 compute-0 python3.9[183655]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:21:35 compute-0 systemd[1]: Reloading.
Dec 03 21:21:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:35 compute-0 systemd-sysv-generator[184043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:35 compute-0 systemd-rc-local-generator[184040]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:35 compute-0 sudo[183637]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:35 compute-0 ceph-mon[75204]: pgmap v449: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v450: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:35 compute-0 sudo[184914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hunlpufkxpixejavxeyyuvsnwitkngtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796895.5774071-336-242558434115603/AnsiballZ_systemd.py'
Dec 03 21:21:35 compute-0 sudo[184914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:36 compute-0 python3.9[184938]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:21:36 compute-0 systemd[1]: Reloading.
Dec 03 21:21:36 compute-0 systemd-rc-local-generator[185308]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:36 compute-0 systemd-sysv-generator[185311]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:36 compute-0 sudo[184914]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:37 compute-0 sudo[186103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfiovkppwypgqzcinpfuwudxseptqrun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796896.9632628-336-50557332961368/AnsiballZ_systemd.py'
Dec 03 21:21:37 compute-0 sudo[186103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:37 compute-0 python3.9[186127]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:21:37 compute-0 systemd[1]: Reloading.
Dec 03 21:21:37 compute-0 ceph-mon[75204]: pgmap v450: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:37 compute-0 systemd-sysv-generator[186497]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:37 compute-0 systemd-rc-local-generator[186490]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v451: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:38 compute-0 sudo[186103]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:38 compute-0 sudo[187351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajnjaekmfqpicbovcmiaxutlummwybph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796898.210086-365-110973590461380/AnsiballZ_systemd.py'
Dec 03 21:21:38 compute-0 sudo[187351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:38 compute-0 python3.9[187366]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:39 compute-0 systemd[1]: Reloading.
Dec 03 21:21:39 compute-0 systemd-rc-local-generator[187802]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:39 compute-0 systemd-sysv-generator[187811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:39 compute-0 sudo[187351]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:39 compute-0 ceph-mon[75204]: pgmap v451: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:39 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:21:39 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:21:39 compute-0 systemd[1]: man-db-cache-update.service: Consumed 12.999s CPU time.
Dec 03 21:21:39 compute-0 systemd[1]: run-rd9af70c2e38147769a682765fb265333.service: Deactivated successfully.
Dec 03 21:21:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v452: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:39 compute-0 sudo[188339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgckhlpkagjypuvfesadqdbngveglygb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796899.5719376-365-259432659197545/AnsiballZ_systemd.py'
Dec 03 21:21:39 compute-0 sudo[188339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:40 compute-0 podman[188341]: 2025-12-03 21:21:40.073000812 +0000 UTC m=+0.128002104 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 21:21:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:40 compute-0 python3.9[188342]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:40 compute-0 systemd[1]: Reloading.
Dec 03 21:21:40 compute-0 systemd-rc-local-generator[188398]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:40 compute-0 systemd-sysv-generator[188402]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:40 compute-0 sudo[188339]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:41 compute-0 sudo[188555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjzatofivhbdkvhoaqghgcbwtydonntp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796900.9226177-365-234173700455549/AnsiballZ_systemd.py'
Dec 03 21:21:41 compute-0 sudo[188555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:41 compute-0 python3.9[188557]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:41 compute-0 ceph-mon[75204]: pgmap v452: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:41 compute-0 systemd[1]: Reloading.
Dec 03 21:21:41 compute-0 systemd-rc-local-generator[188587]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:41 compute-0 systemd-sysv-generator[188591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v453: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:42 compute-0 sudo[188555]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:42 compute-0 sudo[188745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzpoujbcqzfizwakzrtpbnmiizqeuppr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796902.328778-365-51090892978639/AnsiballZ_systemd.py'
Dec 03 21:21:42 compute-0 sudo[188745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:43 compute-0 python3.9[188747]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:43 compute-0 sudo[188745]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:43 compute-0 sudo[188900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyscwbfkbiijyiawbnmpadfwyyhklxkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796903.2914553-365-112561712168774/AnsiballZ_systemd.py'
Dec 03 21:21:43 compute-0 sudo[188900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:43 compute-0 ceph-mon[75204]: pgmap v453: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v454: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:43 compute-0 python3.9[188902]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:44 compute-0 systemd[1]: Reloading.
Dec 03 21:21:44 compute-0 systemd-sysv-generator[188937]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:44 compute-0 systemd-rc-local-generator[188934]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:44 compute-0 sudo[188900]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:45 compute-0 sudo[189091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcvnjmnrtwxqazolwbjnyaeyfcadeesh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796904.6563973-401-109540219201509/AnsiballZ_systemd.py'
Dec 03 21:21:45 compute-0 sudo[189091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:45 compute-0 python3.9[189093]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 03 21:21:45 compute-0 systemd[1]: Reloading.
Dec 03 21:21:45 compute-0 systemd-rc-local-generator[189124]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:21:45 compute-0 systemd-sysv-generator[189128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:21:45 compute-0 ceph-mon[75204]: pgmap v454: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:45 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 03 21:21:45 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 03 21:21:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v455: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:45 compute-0 sudo[189091]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:46 compute-0 sudo[189285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ospzzhugwlbvuugnxymygtqicryrfzzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796906.201068-409-101515982317453/AnsiballZ_systemd.py'
Dec 03 21:21:46 compute-0 sudo[189285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:46 compute-0 python3.9[189287]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:46 compute-0 sudo[189285]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:47 compute-0 sudo[189440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcckvceqtoyshnwxvragbzueywkxfzxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796907.1614609-409-124257690467349/AnsiballZ_systemd.py'
Dec 03 21:21:47 compute-0 sudo[189440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:47 compute-0 ceph-mon[75204]: pgmap v455: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:47 compute-0 python3.9[189442]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:47 compute-0 sudo[189440]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v456: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:48 compute-0 sudo[189595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgzuetmlabtbqyajeawubnpzmvrqykig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796908.0609818-409-8978344898736/AnsiballZ_systemd.py'
Dec 03 21:21:48 compute-0 sudo[189595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:48 compute-0 python3.9[189597]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:48 compute-0 sudo[189595]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:21:48.925 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:21:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:21:48.927 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:21:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:21:48.928 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:21:49 compute-0 sudo[189750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-domfmndjpwvvuctesqelxdegfhtwmzlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796908.9742856-409-9453255105488/AnsiballZ_systemd.py'
Dec 03 21:21:49 compute-0 sudo[189750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:49 compute-0 python3.9[189752]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:49 compute-0 sudo[189750]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:49 compute-0 ceph-mon[75204]: pgmap v456: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v457: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:50 compute-0 sudo[189905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arybjfgrkvvydrxxxoniqsfiehpjbqez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796909.9207678-409-241699701300264/AnsiballZ_systemd.py'
Dec 03 21:21:50 compute-0 sudo[189905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:50 compute-0 python3.9[189907]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:50 compute-0 sudo[189905]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:51 compute-0 sudo[190060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcnccwwvwbrlhwlcdftdfteuiyvvwvud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796910.9790466-409-242204295002204/AnsiballZ_systemd.py'
Dec 03 21:21:51 compute-0 sudo[190060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:51 compute-0 python3.9[190062]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:21:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:21:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:21:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:21:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:21:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:21:51 compute-0 ceph-mon[75204]: pgmap v457: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:51 compute-0 sudo[190060]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v458: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:52 compute-0 podman[190142]: 2025-12-03 21:21:52.165864036 +0000 UTC m=+0.087880716 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 03 21:21:52 compute-0 sudo[190235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxxcmajjnwbwdycnhfxaelzernjwithw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796911.958428-409-61117189206456/AnsiballZ_systemd.py'
Dec 03 21:21:52 compute-0 sudo[190235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:52 compute-0 python3.9[190237]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:53 compute-0 sudo[190235]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:53 compute-0 ceph-mon[75204]: pgmap v458: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v459: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:54 compute-0 sudo[190390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chhlqyrpswedmbinsymcuvzawgqekwar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796913.968854-409-150699040683013/AnsiballZ_systemd.py'
Dec 03 21:21:54 compute-0 sudo[190390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:54 compute-0 python3.9[190392]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:54 compute-0 sudo[190390]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:55 compute-0 sudo[190545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xetjlfdombpichxqcskuthakdumkstzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796914.717585-409-214773928296704/AnsiballZ_systemd.py'
Dec 03 21:21:55 compute-0 sudo[190545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:21:55 compute-0 python3.9[190547]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:55 compute-0 sudo[190545]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:55 compute-0 ceph-mon[75204]: pgmap v459: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v460: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:55 compute-0 sudo[190700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oozyddpusoyaszpvsraynvdquiqtznvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796915.6090987-409-96229501925975/AnsiballZ_systemd.py'
Dec 03 21:21:55 compute-0 sudo[190700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:56 compute-0 python3.9[190702]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:56 compute-0 sudo[190700]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:56 compute-0 sudo[190855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkqolltjgkgtezoowqrgpppwcvscalum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796916.516944-409-186696296773740/AnsiballZ_systemd.py'
Dec 03 21:21:56 compute-0 sudo[190855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:57 compute-0 python3.9[190857]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:57 compute-0 ceph-mon[75204]: pgmap v460: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v461: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:58 compute-0 sudo[190855]: pam_unix(sudo:session): session closed for user root
Dec 03 21:21:58 compute-0 sudo[191010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reeqadmpiciyypzqmkaawjbyqutrvwhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796918.5478976-409-101780306479591/AnsiballZ_systemd.py'
Dec 03 21:21:58 compute-0 sudo[191010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:21:59 compute-0 python3.9[191012]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:21:59 compute-0 ceph-mon[75204]: pgmap v461: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:21:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v462: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:00 compute-0 sudo[191010]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:00 compute-0 sudo[191165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkjjdfkkczjwqsrkwsjfonmblsahnwur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796920.5632143-409-203381382529574/AnsiballZ_systemd.py'
Dec 03 21:22:00 compute-0 sudo[191165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:01 compute-0 python3.9[191167]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:22:01 compute-0 sudo[191165]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:01 compute-0 ceph-mon[75204]: pgmap v462: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v463: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:02 compute-0 sudo[191320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aswlogzgsamtowywinvpzyndtgvztisg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796921.5585907-409-210038828784472/AnsiballZ_systemd.py'
Dec 03 21:22:02 compute-0 sudo[191320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:02 compute-0 python3.9[191322]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 03 21:22:02 compute-0 sudo[191320]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:03 compute-0 sudo[191475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufrhjmbngckmpsqraspbminesligreaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796922.9321876-511-214807862953663/AnsiballZ_file.py'
Dec 03 21:22:03 compute-0 sudo[191475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:03 compute-0 python3.9[191477]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:22:03 compute-0 sudo[191475]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:03 compute-0 ceph-mon[75204]: pgmap v463: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v464: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:04 compute-0 sudo[191627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etnascsbqtaqrkbhmdjfksobhwtihcit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796923.6754053-511-105355905286028/AnsiballZ_file.py'
Dec 03 21:22:04 compute-0 sudo[191627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:04 compute-0 python3.9[191629]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:22:04 compute-0 sudo[191627]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:04 compute-0 sudo[191779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcrbjsisnyikxgibdkkfayerazrgcrgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796924.4226425-511-94845633072608/AnsiballZ_file.py'
Dec 03 21:22:04 compute-0 sudo[191779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:05 compute-0 python3.9[191781]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:22:05 compute-0 sudo[191779]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:05 compute-0 sudo[191931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajylwkputrldaikkjjljkyimtqafizrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796925.2164273-511-132271627719734/AnsiballZ_file.py'
Dec 03 21:22:05 compute-0 sudo[191931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:05 compute-0 python3.9[191933]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:22:05 compute-0 sudo[191931]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:05 compute-0 ceph-mon[75204]: pgmap v464: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v465: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:06 compute-0 sudo[192083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdoqyzexgojfpzdofssgngpjbtiofmjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796925.9704075-511-218352513283915/AnsiballZ_file.py'
Dec 03 21:22:06 compute-0 sudo[192083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:06 compute-0 python3.9[192085]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:22:06 compute-0 sudo[192083]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:07 compute-0 sudo[192235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcuosdlzjmsgksstoicnjfduyfhicxcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796926.7821617-511-2979998109716/AnsiballZ_file.py'
Dec 03 21:22:07 compute-0 sudo[192235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:07 compute-0 python3.9[192237]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:22:07 compute-0 sudo[192235]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:07 compute-0 ceph-mon[75204]: pgmap v465: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v466: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:08 compute-0 sudo[192387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxieewkasplnfdymwdjxvzamxrgqhpkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796927.668142-554-193661436375389/AnsiballZ_stat.py'
Dec 03 21:22:08 compute-0 sudo[192387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:08 compute-0 python3.9[192389]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:08 compute-0 sudo[192387]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:09 compute-0 sudo[192512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiqxsqdevomgwgfhpovwpciqtiexdlqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796927.668142-554-193661436375389/AnsiballZ_copy.py'
Dec 03 21:22:09 compute-0 sudo[192512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:09 compute-0 python3.9[192514]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796927.668142-554-193661436375389/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:09 compute-0 sudo[192512]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:09 compute-0 ceph-mon[75204]: pgmap v466: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v467: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:09 compute-0 sudo[192664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhadruyddqzhozlbolnxopubatncrotz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796929.5827339-554-266300659577984/AnsiballZ_stat.py'
Dec 03 21:22:09 compute-0 sudo[192664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:10 compute-0 python3.9[192666]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:10 compute-0 sudo[192664]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:10 compute-0 sudo[192801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naqieygrufgokbhpsiwcoeljosjldril ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796929.5827339-554-266300659577984/AnsiballZ_copy.py'
Dec 03 21:22:10 compute-0 sudo[192801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:10 compute-0 podman[192763]: 2025-12-03 21:22:10.711780575 +0000 UTC m=+0.094309408 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec 03 21:22:10 compute-0 python3.9[192809]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796929.5827339-554-266300659577984/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:10 compute-0 sudo[192801]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:11 compute-0 sudo[192966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abdzhilslvqxecwqhqorpmysesrjoubh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796931.040219-554-278248195346666/AnsiballZ_stat.py'
Dec 03 21:22:11 compute-0 sudo[192966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:11 compute-0 python3.9[192968]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:11 compute-0 sudo[192966]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:11 compute-0 ceph-mon[75204]: pgmap v467: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v468: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:12 compute-0 sudo[193091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbakgxobsjnnshtqqocyitvmiyniodfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796931.040219-554-278248195346666/AnsiballZ_copy.py'
Dec 03 21:22:12 compute-0 sudo[193091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:12 compute-0 python3.9[193093]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796931.040219-554-278248195346666/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:12 compute-0 sudo[193091]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:13 compute-0 sudo[193243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-symatkfvcmtnkgzbcngokxphxxqzztuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796932.6429336-554-256132310814811/AnsiballZ_stat.py'
Dec 03 21:22:13 compute-0 sudo[193243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:13 compute-0 python3.9[193245]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:13 compute-0 sudo[193243]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:13 compute-0 sudo[193368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfrcvilsgzdiaoeohhxzyzouioibdmas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796932.6429336-554-256132310814811/AnsiballZ_copy.py'
Dec 03 21:22:13 compute-0 sudo[193368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:13 compute-0 ceph-mon[75204]: pgmap v468: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v469: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:14 compute-0 python3.9[193370]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796932.6429336-554-256132310814811/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:14 compute-0 sudo[193368]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:14 compute-0 sudo[193520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmcefvoizujnzztpzepvbzdwytqnuiif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796934.209698-554-66459485172186/AnsiballZ_stat.py'
Dec 03 21:22:14 compute-0 sudo[193520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:14 compute-0 python3.9[193522]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:14 compute-0 sudo[193520]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:15 compute-0 ceph-mon[75204]: pgmap v469: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:15 compute-0 sudo[193645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swwwwucrdjdbmavlxzbgnvpykgblmpzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796934.209698-554-66459485172186/AnsiballZ_copy.py'
Dec 03 21:22:15 compute-0 sudo[193645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v470: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:15 compute-0 python3.9[193647]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796934.209698-554-66459485172186/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:15 compute-0 sudo[193645]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:16 compute-0 sudo[193767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:22:16 compute-0 sudo[193767]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:16 compute-0 sudo[193767]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:16 compute-0 sudo[193826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxwtubzboikecwfbczvvbhmzofjafoba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796936.1799395-554-131257519434828/AnsiballZ_stat.py'
Dec 03 21:22:16 compute-0 sudo[193826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:16 compute-0 sudo[193820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:22:16 compute-0 sudo[193820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:16 compute-0 python3.9[193845]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:16 compute-0 sudo[193826]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:17 compute-0 sudo[194000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkxgglhyuvhstmlpmpqvcyejiicqtdti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796936.1799395-554-131257519434828/AnsiballZ_copy.py'
Dec 03 21:22:17 compute-0 sudo[194000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:17 compute-0 sudo[193820]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:22:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:22:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:22:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:22:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:22:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:22:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:22:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:22:17 compute-0 sudo[194007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:22:17 compute-0 sudo[194007]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:17 compute-0 sudo[194007]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:17 compute-0 ceph-mon[75204]: pgmap v470: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:22:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:22:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:22:17 compute-0 python3.9[194006]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796936.1799395-554-131257519434828/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:17 compute-0 sudo[194000]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:17 compute-0 sudo[194032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:22:17 compute-0 sudo[194032]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:17 compute-0 podman[194146]: 2025-12-03 21:22:17.829690427 +0000 UTC m=+0.042190931 container create 1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_bartik, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:22:17 compute-0 systemd[1]: Started libpod-conmon-1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a.scope.
Dec 03 21:22:17 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:22:17 compute-0 podman[194146]: 2025-12-03 21:22:17.813451052 +0000 UTC m=+0.025951566 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:22:17 compute-0 podman[194146]: 2025-12-03 21:22:17.91973249 +0000 UTC m=+0.132233134 container init 1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_bartik, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:22:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v471: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:17 compute-0 podman[194146]: 2025-12-03 21:22:17.931724481 +0000 UTC m=+0.144224975 container start 1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_bartik, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:22:17 compute-0 podman[194146]: 2025-12-03 21:22:17.9350365 +0000 UTC m=+0.147537034 container attach 1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_bartik, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:22:17 compute-0 tender_bartik[194185]: 167 167
Dec 03 21:22:17 compute-0 systemd[1]: libpod-1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a.scope: Deactivated successfully.
Dec 03 21:22:17 compute-0 podman[194146]: 2025-12-03 21:22:17.940891137 +0000 UTC m=+0.153391681 container died 1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_bartik, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:22:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-bd3e02ddf8cf9205891c48d3b4abbe5d181cad98400f22a508daa4e8809bc6bb-merged.mount: Deactivated successfully.
Dec 03 21:22:17 compute-0 podman[194146]: 2025-12-03 21:22:17.991537154 +0000 UTC m=+0.204037688 container remove 1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=tender_bartik, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:22:18 compute-0 systemd[1]: libpod-conmon-1596133c9645282bdee465c891a2f000c8cdd0ad74da762446767d9533ddb92a.scope: Deactivated successfully.
Dec 03 21:22:18 compute-0 sudo[194254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzsttrujiytqfvenaawjtfgfwfvmmuve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796937.7007084-554-64762695299329/AnsiballZ_stat.py'
Dec 03 21:22:18 compute-0 sudo[194254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:18 compute-0 podman[194262]: 2025-12-03 21:22:18.224411274 +0000 UTC m=+0.072921785 container create d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_grothendieck, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:22:18 compute-0 python3.9[194256]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:18 compute-0 sudo[194254]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:18 compute-0 systemd[1]: Started libpod-conmon-d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82.scope.
Dec 03 21:22:18 compute-0 podman[194262]: 2025-12-03 21:22:18.195437197 +0000 UTC m=+0.043947778 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:22:18 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:22:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b2431858eecb2eb73468ef675f6a2a8a747b5c67bc4ae89aaf04cc2b9b1d5a6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b2431858eecb2eb73468ef675f6a2a8a747b5c67bc4ae89aaf04cc2b9b1d5a6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b2431858eecb2eb73468ef675f6a2a8a747b5c67bc4ae89aaf04cc2b9b1d5a6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b2431858eecb2eb73468ef675f6a2a8a747b5c67bc4ae89aaf04cc2b9b1d5a6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b2431858eecb2eb73468ef675f6a2a8a747b5c67bc4ae89aaf04cc2b9b1d5a6/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:18 compute-0 podman[194262]: 2025-12-03 21:22:18.314032806 +0000 UTC m=+0.162543387 container init d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_grothendieck, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 03 21:22:18 compute-0 podman[194262]: 2025-12-03 21:22:18.333730233 +0000 UTC m=+0.182240784 container start d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_grothendieck, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:22:18 compute-0 podman[194262]: 2025-12-03 21:22:18.337843563 +0000 UTC m=+0.186354114 container attach d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_grothendieck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:22:18 compute-0 sudo[194411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axuagywxbsxceqqnumlvuastczrxlhzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796937.7007084-554-64762695299329/AnsiballZ_copy.py'
Dec 03 21:22:18 compute-0 sudo[194411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:18 compute-0 python3.9[194413]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796937.7007084-554-64762695299329/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:18 compute-0 sad_grothendieck[194279]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:22:18 compute-0 sad_grothendieck[194279]: --> All data devices are unavailable
Dec 03 21:22:18 compute-0 sudo[194411]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:18 compute-0 systemd[1]: libpod-d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82.scope: Deactivated successfully.
Dec 03 21:22:18 compute-0 podman[194262]: 2025-12-03 21:22:18.961780722 +0000 UTC m=+0.810291233 container died d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_grothendieck, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS)
Dec 03 21:22:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-3b2431858eecb2eb73468ef675f6a2a8a747b5c67bc4ae89aaf04cc2b9b1d5a6-merged.mount: Deactivated successfully.
Dec 03 21:22:19 compute-0 podman[194262]: 2025-12-03 21:22:19.008355931 +0000 UTC m=+0.856866462 container remove d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_grothendieck, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:22:19 compute-0 systemd[1]: libpod-conmon-d91ce80a930f149e1355879979aaefddf6828b18a4dc30516ff16bdefb058c82.scope: Deactivated successfully.
Dec 03 21:22:19 compute-0 sudo[194032]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:19 compute-0 sudo[194461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:22:19 compute-0 sudo[194461]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:19 compute-0 sudo[194461]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:19 compute-0 sudo[194527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:22:19 compute-0 sudo[194527]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:19 compute-0 sudo[194641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysaxggxzwqiwliirszvjlxnlystljrrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796939.0907476-554-99759697170507/AnsiballZ_stat.py'
Dec 03 21:22:19 compute-0 sudo[194641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:19 compute-0 podman[194645]: 2025-12-03 21:22:19.474186053 +0000 UTC m=+0.054540033 container create d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=festive_lamport, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:22:19 compute-0 ceph-mon[75204]: pgmap v471: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:19 compute-0 systemd[1]: Started libpod-conmon-d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf.scope.
Dec 03 21:22:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:22:19 compute-0 podman[194645]: 2025-12-03 21:22:19.446890741 +0000 UTC m=+0.027244781 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:22:19 compute-0 podman[194645]: 2025-12-03 21:22:19.554256889 +0000 UTC m=+0.134610869 container init d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=festive_lamport, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:22:19 compute-0 podman[194645]: 2025-12-03 21:22:19.5602875 +0000 UTC m=+0.140641460 container start d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=festive_lamport, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:22:19 compute-0 podman[194645]: 2025-12-03 21:22:19.563905887 +0000 UTC m=+0.144259877 container attach d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=festive_lamport, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:22:19 compute-0 festive_lamport[194663]: 167 167
Dec 03 21:22:19 compute-0 systemd[1]: libpod-d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf.scope: Deactivated successfully.
Dec 03 21:22:19 compute-0 conmon[194663]: conmon d00bb9642d0fc392aea2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf.scope/container/memory.events
Dec 03 21:22:19 compute-0 podman[194645]: 2025-12-03 21:22:19.566816075 +0000 UTC m=+0.147170035 container died d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=festive_lamport, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:22:19 compute-0 python3.9[194647]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-575d8cb8f6be56bda2f1e0d835fd17cedd2edaf049bddea5845a710c0568f0fe-merged.mount: Deactivated successfully.
Dec 03 21:22:19 compute-0 sudo[194641]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:19 compute-0 podman[194645]: 2025-12-03 21:22:19.610363092 +0000 UTC m=+0.190717052 container remove d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=festive_lamport, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:22:19 compute-0 systemd[1]: libpod-conmon-d00bb9642d0fc392aea27b0abcf7ba62e68830cfdfa4d8987c052d2125e7caaf.scope: Deactivated successfully.
Dec 03 21:22:19 compute-0 podman[194718]: 2025-12-03 21:22:19.806379695 +0000 UTC m=+0.067660545 container create c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_hermann, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:22:19 compute-0 systemd[1]: Started libpod-conmon-c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805.scope.
Dec 03 21:22:19 compute-0 podman[194718]: 2025-12-03 21:22:19.779017141 +0000 UTC m=+0.040298041 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:22:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:22:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e2e01379306566ac285a67c4bc63d0df766e14055221e9da744db967051ed8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e2e01379306566ac285a67c4bc63d0df766e14055221e9da744db967051ed8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e2e01379306566ac285a67c4bc63d0df766e14055221e9da744db967051ed8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e2e01379306566ac285a67c4bc63d0df766e14055221e9da744db967051ed8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:19 compute-0 podman[194718]: 2025-12-03 21:22:19.923509102 +0000 UTC m=+0.184789962 container init c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_hermann, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:22:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v472: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:19 compute-0 podman[194718]: 2025-12-03 21:22:19.940519818 +0000 UTC m=+0.201800668 container start c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_hermann, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:22:19 compute-0 podman[194718]: 2025-12-03 21:22:19.944964208 +0000 UTC m=+0.206245058 container attach c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_hermann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:22:20 compute-0 sudo[194828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebqyuptuwgdjvhcrcwyrtldcawoqfqrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796939.0907476-554-99759697170507/AnsiballZ_copy.py'
Dec 03 21:22:20 compute-0 sudo[194828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]: {
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:     "0": [
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:         {
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "devices": [
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "/dev/loop3"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             ],
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_name": "ceph_lv0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_size": "21470642176",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "name": "ceph_lv0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "tags": {
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cluster_name": "ceph",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.crush_device_class": "",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.encrypted": "0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.objectstore": "bluestore",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osd_id": "0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.type": "block",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.vdo": "0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.with_tpm": "0"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             },
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "type": "block",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "vg_name": "ceph_vg0"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:         }
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:     ],
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:     "1": [
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:         {
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "devices": [
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "/dev/loop4"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             ],
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_name": "ceph_lv1",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_size": "21470642176",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "name": "ceph_lv1",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "tags": {
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cluster_name": "ceph",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.crush_device_class": "",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.encrypted": "0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.objectstore": "bluestore",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osd_id": "1",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.type": "block",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.vdo": "0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.with_tpm": "0"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             },
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "type": "block",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "vg_name": "ceph_vg1"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:         }
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:     ],
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:     "2": [
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:         {
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "devices": [
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "/dev/loop5"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             ],
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_name": "ceph_lv2",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_size": "21470642176",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "name": "ceph_lv2",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "tags": {
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.cluster_name": "ceph",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.crush_device_class": "",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.encrypted": "0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.objectstore": "bluestore",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osd_id": "2",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.type": "block",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.vdo": "0",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:                 "ceph.with_tpm": "0"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             },
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "type": "block",
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:             "vg_name": "ceph_vg2"
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:         }
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]:     ]
Dec 03 21:22:20 compute-0 compassionate_hermann[194773]: }
Dec 03 21:22:20 compute-0 python3.9[194830]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764796939.0907476-554-99759697170507/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:20 compute-0 systemd[1]: libpod-c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805.scope: Deactivated successfully.
Dec 03 21:22:20 compute-0 podman[194718]: 2025-12-03 21:22:20.306234449 +0000 UTC m=+0.567515279 container died c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_hermann, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:22:20 compute-0 sudo[194828]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-47e2e01379306566ac285a67c4bc63d0df766e14055221e9da744db967051ed8-merged.mount: Deactivated successfully.
Dec 03 21:22:20 compute-0 podman[194718]: 2025-12-03 21:22:20.347096044 +0000 UTC m=+0.608376874 container remove c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_hermann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:22:20 compute-0 systemd[1]: libpod-conmon-c1b5cc1943ab2e35760ee83fefa3a1a423a92dc1437d3a78c33dd0911fb97805.scope: Deactivated successfully.
Dec 03 21:22:20 compute-0 sudo[194527]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:20 compute-0 sudo[194869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:22:20 compute-0 sudo[194869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:20 compute-0 sudo[194869]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:20 compute-0 sudo[194896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:22:20 compute-0 sudo[194896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:20 compute-0 podman[195029]: 2025-12-03 21:22:20.871168256 +0000 UTC m=+0.075216506 container create 9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=determined_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:22:20 compute-0 sudo[195069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbuyuaujepjlvpxkeoxzcmorxejkspwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796940.5165284-667-141462461081840/AnsiballZ_command.py'
Dec 03 21:22:20 compute-0 sudo[195069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:20 compute-0 systemd[1]: Started libpod-conmon-9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa.scope.
Dec 03 21:22:20 compute-0 podman[195029]: 2025-12-03 21:22:20.840074593 +0000 UTC m=+0.044122903 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:22:20 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:22:20 compute-0 podman[195029]: 2025-12-03 21:22:20.971815803 +0000 UTC m=+0.175864063 container init 9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=determined_yonath, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True)
Dec 03 21:22:20 compute-0 podman[195029]: 2025-12-03 21:22:20.982529729 +0000 UTC m=+0.186577949 container start 9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=determined_yonath, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:22:20 compute-0 podman[195029]: 2025-12-03 21:22:20.985488558 +0000 UTC m=+0.189536878 container attach 9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=determined_yonath, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, OSD_FLAVOR=default)
Dec 03 21:22:20 compute-0 systemd[1]: libpod-9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa.scope: Deactivated successfully.
Dec 03 21:22:20 compute-0 determined_yonath[195074]: 167 167
Dec 03 21:22:20 compute-0 conmon[195074]: conmon 9f6f16d8f0d6a4be873f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa.scope/container/memory.events
Dec 03 21:22:20 compute-0 podman[195029]: 2025-12-03 21:22:20.992323692 +0000 UTC m=+0.196371942 container died 9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=determined_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True)
Dec 03 21:22:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-93df7e46bddb3fc0b2d9d58a8411833f911bfba73ae62a816aa7160b36e5a059-merged.mount: Deactivated successfully.
Dec 03 21:22:21 compute-0 podman[195029]: 2025-12-03 21:22:21.032449157 +0000 UTC m=+0.236497387 container remove 9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=determined_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:22:21 compute-0 systemd[1]: libpod-conmon-9f6f16d8f0d6a4be873f400ce4af93c3a6f39cb8e5221849c6428ae769da5faa.scope: Deactivated successfully.
Dec 03 21:22:21 compute-0 python3.9[195071]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 03 21:22:21 compute-0 sudo[195069]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:21 compute-0 podman[195099]: 2025-12-03 21:22:21.189104425 +0000 UTC m=+0.034139446 container create 17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_swanson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:22:21
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['images', 'cephfs.cephfs.meta', 'volumes', '.mgr', 'vms', 'cephfs.cephfs.data', 'backups']
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:22:21 compute-0 systemd[1]: Started libpod-conmon-17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd.scope.
Dec 03 21:22:21 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:22:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cdfc30e8da3ebc004b8bc419cfba64cfc4fa379e8d1f814ae74d734d24bffa0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cdfc30e8da3ebc004b8bc419cfba64cfc4fa379e8d1f814ae74d734d24bffa0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cdfc30e8da3ebc004b8bc419cfba64cfc4fa379e8d1f814ae74d734d24bffa0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8cdfc30e8da3ebc004b8bc419cfba64cfc4fa379e8d1f814ae74d734d24bffa0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:22:21 compute-0 podman[195099]: 2025-12-03 21:22:21.17473417 +0000 UTC m=+0.019769221 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:22:21 compute-0 podman[195099]: 2025-12-03 21:22:21.276049764 +0000 UTC m=+0.121084865 container init 17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_swanson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:22:21 compute-0 podman[195099]: 2025-12-03 21:22:21.287673806 +0000 UTC m=+0.132708867 container start 17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_swanson, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:22:21 compute-0 podman[195099]: 2025-12-03 21:22:21.291523579 +0000 UTC m=+0.136558640 container attach 17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_swanson, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle)
Dec 03 21:22:21 compute-0 ceph-mon[75204]: pgmap v472: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:21 compute-0 sudo[195297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllyrygsscjpebihwmclqwkitltwrafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796941.4016442-676-206536529242532/AnsiballZ_file.py'
Dec 03 21:22:21 compute-0 sudo[195297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:22:21 compute-0 python3.9[195304]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v473: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:21 compute-0 sudo[195297]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:21 compute-0 lvm[195351]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:22:21 compute-0 lvm[195351]: VG ceph_vg1 finished
Dec 03 21:22:21 compute-0 lvm[195352]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:22:21 compute-0 lvm[195352]: VG ceph_vg0 finished
Dec 03 21:22:22 compute-0 lvm[195371]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:22:22 compute-0 lvm[195371]: VG ceph_vg2 finished
Dec 03 21:22:22 compute-0 gallant_swanson[195139]: {}
Dec 03 21:22:22 compute-0 systemd[1]: libpod-17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd.scope: Deactivated successfully.
Dec 03 21:22:22 compute-0 systemd[1]: libpod-17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd.scope: Consumed 1.374s CPU time.
Dec 03 21:22:22 compute-0 podman[195099]: 2025-12-03 21:22:22.176994617 +0000 UTC m=+1.022029678 container died 17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_swanson, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:22:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-8cdfc30e8da3ebc004b8bc419cfba64cfc4fa379e8d1f814ae74d734d24bffa0-merged.mount: Deactivated successfully.
Dec 03 21:22:22 compute-0 podman[195099]: 2025-12-03 21:22:22.229244046 +0000 UTC m=+1.074279067 container remove 17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_swanson, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:22:22 compute-0 systemd[1]: libpod-conmon-17a5edae3c68f0909f10cd3b12afdf438ba1b280bd6a956e1daf98e344716ffd.scope: Deactivated successfully.
Dec 03 21:22:22 compute-0 podman[195406]: 2025-12-03 21:22:22.265628951 +0000 UTC m=+0.056196177 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 03 21:22:22 compute-0 sudo[194896]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:22:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:22:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:22:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:22:22 compute-0 sudo[195479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:22:22 compute-0 sudo[195479]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:22:22 compute-0 sudo[195479]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:22 compute-0 sudo[195556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aivrqqnffipbdjgonygojqywhiesxzns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796942.145134-676-72360658283160/AnsiballZ_file.py'
Dec 03 21:22:22 compute-0 sudo[195556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:22 compute-0 python3.9[195558]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:22 compute-0 sudo[195556]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:23 compute-0 sudo[195708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaunrqpzxddryffyfyzsnxbqxpfivxdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796942.9272637-676-51762184696523/AnsiballZ_file.py'
Dec 03 21:22:23 compute-0 sudo[195708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:23 compute-0 ceph-mon[75204]: pgmap v473: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:22:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:22:23 compute-0 python3.9[195710]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:23 compute-0 sudo[195708]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v474: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:24 compute-0 sudo[195860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjdxglwhcgplzwtcbqciilpnpjnobsoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796943.7709005-676-260027557966283/AnsiballZ_file.py'
Dec 03 21:22:24 compute-0 sudo[195860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:24 compute-0 python3.9[195862]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:24 compute-0 sudo[195860]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:24 compute-0 sudo[196012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmdacovizbkotwueuwpjpsnmlusrpgfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796944.5158346-676-40890466316512/AnsiballZ_file.py'
Dec 03 21:22:24 compute-0 sudo[196012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:25 compute-0 python3.9[196014]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:25 compute-0 sudo[196012]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:25 compute-0 ceph-mon[75204]: pgmap v474: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:25 compute-0 sudo[196164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hewlvitbqfnnshacrwgpvjwyhgzyyoxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796945.2677495-676-225184268684591/AnsiballZ_file.py'
Dec 03 21:22:25 compute-0 sudo[196164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:25 compute-0 python3.9[196166]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:25 compute-0 sudo[196164]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v475: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:26 compute-0 sudo[196316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfeojytdghtxqoqjxxvkgofidklglvos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796945.952017-676-257806616307979/AnsiballZ_file.py'
Dec 03 21:22:26 compute-0 sudo[196316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:26 compute-0 python3.9[196318]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:26 compute-0 sudo[196316]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:27 compute-0 sudo[196468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcsjnntrrhhjunujfkqrduiucuhsiwnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796946.6891527-676-36170781946944/AnsiballZ_file.py'
Dec 03 21:22:27 compute-0 sudo[196468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:27 compute-0 ceph-mon[75204]: pgmap v475: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:27 compute-0 python3.9[196470]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:27 compute-0 sudo[196468]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:22:27 compute-0 sudo[196620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybwgeqknkfjpmqukcqidogrthakxkjrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796947.5138934-676-278401372018267/AnsiballZ_file.py'
Dec 03 21:22:27 compute-0 sudo[196620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v476: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:28 compute-0 python3.9[196622]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:28 compute-0 sudo[196620]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:28 compute-0 sudo[196772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpsrnoobkpdohvqnbyrcasjwewyqlxct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796948.2950473-676-18395715716838/AnsiballZ_file.py'
Dec 03 21:22:28 compute-0 sudo[196772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:28 compute-0 python3.9[196774]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:28 compute-0 sudo[196772]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:29 compute-0 ceph-mon[75204]: pgmap v476: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:29 compute-0 sudo[196924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zibpvuisekzkuvhznlludtqeqmkapmyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796949.0280895-676-244425325804131/AnsiballZ_file.py'
Dec 03 21:22:29 compute-0 sudo[196924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:29 compute-0 python3.9[196926]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:29 compute-0 sudo[196924]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v477: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:30 compute-0 sudo[197076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwnneiyvhgqvntzyphselnjutyhrjgrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796949.7536018-676-117536235923980/AnsiballZ_file.py'
Dec 03 21:22:30 compute-0 sudo[197076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:30 compute-0 python3.9[197078]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:30 compute-0 sudo[197076]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:30 compute-0 sudo[197228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmrvqvqaqganpxjrrdbxnzsanwhfivht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796950.5271268-676-235517158057585/AnsiballZ_file.py'
Dec 03 21:22:30 compute-0 sudo[197228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:31 compute-0 python3.9[197230]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:31 compute-0 sudo[197228]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:31 compute-0 ceph-mon[75204]: pgmap v477: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:31 compute-0 sudo[197380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmisodlmfsmuknaijtiuwewggmprlfxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796951.1983507-676-102982842484610/AnsiballZ_file.py'
Dec 03 21:22:31 compute-0 sudo[197380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:31 compute-0 python3.9[197382]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:31 compute-0 sudo[197380]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v478: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:32 compute-0 sudo[197532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkxgfcboysgyhuveagkkjufnxnabokjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796952.0588698-775-125735920412027/AnsiballZ_stat.py'
Dec 03 21:22:32 compute-0 sudo[197532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:32 compute-0 python3.9[197534]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:32 compute-0 sudo[197532]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:33 compute-0 sudo[197655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knarbubhcentyesgmkzjjfijjrzogmkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796952.0588698-775-125735920412027/AnsiballZ_copy.py'
Dec 03 21:22:33 compute-0 sudo[197655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:33 compute-0 ceph-mon[75204]: pgmap v478: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:33 compute-0 python3.9[197657]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796952.0588698-775-125735920412027/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:33 compute-0 sudo[197655]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v479: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:34 compute-0 sudo[197807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckvhavakynazqklujcspcxzsiqgidfus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796953.669058-775-113652748602020/AnsiballZ_stat.py'
Dec 03 21:22:34 compute-0 sudo[197807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:34 compute-0 python3.9[197809]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:34 compute-0 sudo[197807]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:34 compute-0 sudo[197930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgxgbxxfvcvoglfusgoubbmzjffqganu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796953.669058-775-113652748602020/AnsiballZ_copy.py'
Dec 03 21:22:34 compute-0 sudo[197930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:34 compute-0 python3.9[197932]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796953.669058-775-113652748602020/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:34 compute-0 sudo[197930]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:35 compute-0 ceph-mon[75204]: pgmap v479: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:35 compute-0 sudo[198082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qizenjqwncfhqnmxuekcwclikdfwubuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796955.0608366-775-8421491780627/AnsiballZ_stat.py'
Dec 03 21:22:35 compute-0 sudo[198082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:35 compute-0 python3.9[198084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:35 compute-0 sudo[198082]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v480: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:36 compute-0 sudo[198205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfwcniijgwmirjaaqcxmrdthqffntkjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796955.0608366-775-8421491780627/AnsiballZ_copy.py'
Dec 03 21:22:36 compute-0 sudo[198205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:36 compute-0 python3.9[198207]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796955.0608366-775-8421491780627/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:36 compute-0 sudo[198205]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:36 compute-0 sudo[198357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plpuajiuoxqglddkmmxwvjxoxafwckun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796956.3651183-775-91283292380072/AnsiballZ_stat.py'
Dec 03 21:22:36 compute-0 sudo[198357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:36 compute-0 python3.9[198359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:36 compute-0 sudo[198357]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:37 compute-0 ceph-mon[75204]: pgmap v480: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:37 compute-0 sudo[198480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhczmoknmfomjnwdvqdprzyiflouofgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796956.3651183-775-91283292380072/AnsiballZ_copy.py'
Dec 03 21:22:37 compute-0 sudo[198480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:37 compute-0 python3.9[198482]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796956.3651183-775-91283292380072/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:37 compute-0 sudo[198480]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v481: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:38 compute-0 sudo[198632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfaqqjlwynecbwuhjhjxseiyuekgeneo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796957.9022694-775-150774375380999/AnsiballZ_stat.py'
Dec 03 21:22:38 compute-0 sudo[198632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:38 compute-0 python3.9[198634]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:38 compute-0 sudo[198632]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:38 compute-0 sudo[198755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akjcxdkvnkpepzxyqbyzfbjmtmbagnvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796957.9022694-775-150774375380999/AnsiballZ_copy.py'
Dec 03 21:22:38 compute-0 sudo[198755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:39 compute-0 python3.9[198757]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796957.9022694-775-150774375380999/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:39 compute-0 sudo[198755]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:39 compute-0 ceph-mon[75204]: pgmap v481: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:39 compute-0 sudo[198907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhzuktpkzstylvqomkuhinynrujrfwma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796959.2957637-775-157251909379190/AnsiballZ_stat.py'
Dec 03 21:22:39 compute-0 sudo[198907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:39 compute-0 python3.9[198909]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:39 compute-0 sudo[198907]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v482: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:40 compute-0 sudo[199030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekupybqfzsmvuzdowjnijebnzplmxhal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796959.2957637-775-157251909379190/AnsiballZ_copy.py'
Dec 03 21:22:40 compute-0 sudo[199030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:40 compute-0 python3.9[199032]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796959.2957637-775-157251909379190/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:40 compute-0 sudo[199030]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:40 compute-0 sudo[199198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmogirnsgmalzehjxjsdjqvxbqqhigy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796960.5886157-775-98184076052581/AnsiballZ_stat.py'
Dec 03 21:22:40 compute-0 sudo[199198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:40 compute-0 podman[199156]: 2025-12-03 21:22:40.997430559 +0000 UTC m=+0.133871649 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 03 21:22:41 compute-0 python3.9[199206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:41 compute-0 sudo[199198]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:41 compute-0 ceph-mon[75204]: pgmap v482: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:41 compute-0 sudo[199332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubegokchrgkbvxtguwefylbbqwpfvvbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796960.5886157-775-98184076052581/AnsiballZ_copy.py'
Dec 03 21:22:41 compute-0 sudo[199332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:41 compute-0 python3.9[199334]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796960.5886157-775-98184076052581/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:41 compute-0 sudo[199332]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v483: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:42 compute-0 sudo[199484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngjakkthkmyazrppgyyntnmlawfhqoid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796961.943828-775-108482491602721/AnsiballZ_stat.py'
Dec 03 21:22:42 compute-0 sudo[199484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:42 compute-0 python3.9[199486]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:42 compute-0 sudo[199484]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:43 compute-0 sudo[199607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpagevjlurvkdpnqgkxyugtaqdjnnyxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796961.943828-775-108482491602721/AnsiballZ_copy.py'
Dec 03 21:22:43 compute-0 sudo[199607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:43 compute-0 python3.9[199609]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796961.943828-775-108482491602721/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:43 compute-0 sudo[199607]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:43 compute-0 ceph-mon[75204]: pgmap v483: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:43 compute-0 sudo[199759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoczvemyzrwsxeqtfqrjekwohichjztw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796963.393386-775-204007469252364/AnsiballZ_stat.py'
Dec 03 21:22:43 compute-0 sudo[199759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:43 compute-0 python3.9[199761]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:43 compute-0 sudo[199759]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v484: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:44 compute-0 sudo[199882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hioiyjfusyyydwmxxzokeribpgyuwdfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796963.393386-775-204007469252364/AnsiballZ_copy.py'
Dec 03 21:22:44 compute-0 sudo[199882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:44 compute-0 python3.9[199884]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796963.393386-775-204007469252364/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:44 compute-0 sudo[199882]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:45 compute-0 sudo[200034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zznzfxljsbuzntnsisarkgmynjijjwkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796964.730089-775-161098405647775/AnsiballZ_stat.py'
Dec 03 21:22:45 compute-0 sudo[200034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:45 compute-0 python3.9[200036]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:45 compute-0 sudo[200034]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:45 compute-0 ceph-mon[75204]: pgmap v484: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:45 compute-0 sudo[200157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dumdhwdvbjimrglyxoqjjgwplbdehuhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796964.730089-775-161098405647775/AnsiballZ_copy.py'
Dec 03 21:22:45 compute-0 sudo[200157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:45 compute-0 python3.9[200159]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796964.730089-775-161098405647775/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v485: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:45 compute-0 sudo[200157]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:46 compute-0 sudo[200309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwhbmlwqbxvybrmtpqhaqhdpnoiukwqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796966.1547184-775-233027491023874/AnsiballZ_stat.py'
Dec 03 21:22:46 compute-0 sudo[200309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:46 compute-0 python3.9[200311]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:46 compute-0 sudo[200309]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:47 compute-0 sudo[200432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhdpbnwkagploxhljpiodrtdostkwqpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796966.1547184-775-233027491023874/AnsiballZ_copy.py'
Dec 03 21:22:47 compute-0 sudo[200432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:47 compute-0 python3.9[200434]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796966.1547184-775-233027491023874/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:47 compute-0 sudo[200432]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:47 compute-0 ceph-mon[75204]: pgmap v485: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:47 compute-0 sudo[200584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvzcdxdbxuuglnhdetpkvgkmjaxsbxhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796967.4964285-775-267235759890829/AnsiballZ_stat.py'
Dec 03 21:22:47 compute-0 sudo[200584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v486: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:48 compute-0 python3.9[200586]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:48 compute-0 sudo[200584]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:48 compute-0 sudo[200707]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcetnjjwvhsxslirbnmlkzrqresmuuvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796967.4964285-775-267235759890829/AnsiballZ_copy.py'
Dec 03 21:22:48 compute-0 sudo[200707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:48 compute-0 python3.9[200709]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796967.4964285-775-267235759890829/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:48 compute-0 sudo[200707]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:22:48.925 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:22:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:22:48.926 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:22:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:22:48.926 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:22:49 compute-0 sudo[200859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lblybsgqdbynkgqehvsmumsrbmnqvehn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796968.871055-775-191039360193399/AnsiballZ_stat.py'
Dec 03 21:22:49 compute-0 sudo[200859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:49 compute-0 python3.9[200861]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:49 compute-0 sudo[200859]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:49 compute-0 ceph-mon[75204]: pgmap v486: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:49 compute-0 sudo[200982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzomxszkqptsstmvscjnycmaipsifljh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796968.871055-775-191039360193399/AnsiballZ_copy.py'
Dec 03 21:22:49 compute-0 sudo[200982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v487: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:50 compute-0 python3.9[200984]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796968.871055-775-191039360193399/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:50 compute-0 sudo[200982]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:50 compute-0 sudo[201134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qewqjqiobhacobzlsclqefjwapjxyaqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796970.2417803-775-218587424752784/AnsiballZ_stat.py'
Dec 03 21:22:50 compute-0 sudo[201134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:50 compute-0 python3.9[201136]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:22:50 compute-0 sudo[201134]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:51 compute-0 sudo[201257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqdvlaangnehoqcsgrvmvcyvcuslrdwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796970.2417803-775-218587424752784/AnsiballZ_copy.py'
Dec 03 21:22:51 compute-0 sudo[201257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:51 compute-0 python3.9[201259]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764796970.2417803-775-218587424752784/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:51 compute-0 sudo[201257]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:22:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:22:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:22:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:22:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:22:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:22:51 compute-0 ceph-mon[75204]: pgmap v487: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v488: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:52 compute-0 python3.9[201409]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:22:53 compute-0 sudo[201574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwhjvzindqmugbgeolfzujwmpeurcqzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796972.4690123-981-88973132595373/AnsiballZ_seboolean.py'
Dec 03 21:22:53 compute-0 sudo[201574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:53 compute-0 podman[201536]: 2025-12-03 21:22:53.093946717 +0000 UTC m=+0.091909494 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:22:53 compute-0 python3.9[201581]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 03 21:22:53 compute-0 ceph-mon[75204]: pgmap v488: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v489: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:54 compute-0 sudo[201574]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:55 compute-0 sudo[201737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlhnjcflwlmemfmcaaaprgneokwkznxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796974.740357-989-182500337035118/AnsiballZ_copy.py'
Dec 03 21:22:55 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 03 21:22:55 compute-0 sudo[201737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:55 compute-0 python3.9[201739]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:55 compute-0 sudo[201737]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:22:55 compute-0 sudo[201889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjwzjdxfuxhrhcpojrmvapgswdjsrvgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796975.4699388-989-216794165769139/AnsiballZ_copy.py'
Dec 03 21:22:55 compute-0 sudo[201889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:55 compute-0 ceph-mon[75204]: pgmap v489: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v490: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:55 compute-0 python3.9[201891]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:56 compute-0 sudo[201889]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:56 compute-0 sudo[202041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztlzujtvaodipcvxtjqirkqoqnkvrdxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796976.1478398-989-209537085489467/AnsiballZ_copy.py'
Dec 03 21:22:56 compute-0 sudo[202041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:56 compute-0 python3.9[202043]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:56 compute-0 sudo[202041]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:57 compute-0 sudo[202193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nimuufvclksshwrhttijglvbaljontng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796976.8505642-989-117419985944904/AnsiballZ_copy.py'
Dec 03 21:22:57 compute-0 sudo[202193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:57 compute-0 python3.9[202195]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:57 compute-0 sudo[202193]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:57 compute-0 ceph-mon[75204]: pgmap v490: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:57 compute-0 sudo[202345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjqxscjutdjjeuckaeecdewymejmocfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796977.6634576-989-54817238367837/AnsiballZ_copy.py'
Dec 03 21:22:57 compute-0 sudo[202345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v491: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:58 compute-0 python3.9[202347]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:58 compute-0 sudo[202345]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:58 compute-0 sudo[202497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpzsewvjwgkkfwzkkcjrqkicnrduagom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796978.3886948-1025-51183416882662/AnsiballZ_copy.py'
Dec 03 21:22:58 compute-0 sudo[202497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:59 compute-0 python3.9[202499]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:59 compute-0 sudo[202497]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:59 compute-0 sudo[202649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcqsqavpebvqtljwzxyrwdndxggqxkle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796979.2336748-1025-138384997781998/AnsiballZ_copy.py'
Dec 03 21:22:59 compute-0 sudo[202649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:22:59 compute-0 python3.9[202651]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:22:59 compute-0 sudo[202649]: pam_unix(sudo:session): session closed for user root
Dec 03 21:22:59 compute-0 ceph-mon[75204]: pgmap v491: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:22:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v492: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:00 compute-0 sudo[202801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eycmzafcngmdotstewdnpswyiveheexy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796980.0391047-1025-273081464287613/AnsiballZ_copy.py'
Dec 03 21:23:00 compute-0 sudo[202801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:00 compute-0 python3.9[202803]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:00 compute-0 sudo[202801]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:01 compute-0 auditd[706]: Audit daemon rotating log files
Dec 03 21:23:01 compute-0 sudo[202953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxonsjzszvwofrynniqcppejzlkhyrwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796980.7960443-1025-10414218500269/AnsiballZ_copy.py'
Dec 03 21:23:01 compute-0 sudo[202953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:01 compute-0 python3.9[202955]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:01 compute-0 sudo[202953]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:01 compute-0 ceph-mon[75204]: pgmap v492: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v493: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:01 compute-0 sudo[203105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cixcywbhdnhkjnznnzipvcfjyfuszjgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796981.6025977-1025-262395341385021/AnsiballZ_copy.py'
Dec 03 21:23:01 compute-0 sudo[203105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:02 compute-0 python3.9[203107]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:02 compute-0 sudo[203105]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:02 compute-0 sudo[203257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clcejvvtbhkscejswdbghhjciavochnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796982.501559-1061-96580164442933/AnsiballZ_systemd.py'
Dec 03 21:23:02 compute-0 sudo[203257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:03 compute-0 python3.9[203259]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:23:03 compute-0 systemd[1]: Reloading.
Dec 03 21:23:03 compute-0 systemd-rc-local-generator[203287]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:03 compute-0 systemd-sysv-generator[203290]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:03 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Dec 03 21:23:03 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Dec 03 21:23:03 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 03 21:23:03 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 03 21:23:03 compute-0 systemd[1]: Starting libvirt logging daemon...
Dec 03 21:23:03 compute-0 systemd[1]: Started libvirt logging daemon.
Dec 03 21:23:03 compute-0 sudo[203257]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:03 compute-0 ceph-mon[75204]: pgmap v493: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v494: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:04 compute-0 sudo[203450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aowfcnugcwnlovuczmjafivnrwtyrdzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796984.0096788-1061-97731974425465/AnsiballZ_systemd.py'
Dec 03 21:23:04 compute-0 sudo[203450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:04 compute-0 python3.9[203452]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:23:04 compute-0 systemd[1]: Reloading.
Dec 03 21:23:04 compute-0 systemd-sysv-generator[203482]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:04 compute-0 systemd-rc-local-generator[203479]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:05 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 03 21:23:05 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 03 21:23:05 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 03 21:23:05 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 03 21:23:05 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 03 21:23:05 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 03 21:23:05 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 03 21:23:05 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 03 21:23:05 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 03 21:23:05 compute-0 sudo[203450]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:05 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 03 21:23:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:05 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 03 21:23:05 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 03 21:23:05 compute-0 sudo[203674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjepngaotuiduvzdvtcsxsysnxovbtvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796985.3362067-1061-162949825248566/AnsiballZ_systemd.py'
Dec 03 21:23:05 compute-0 sudo[203674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:05 compute-0 ceph-mon[75204]: pgmap v494: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v495: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:06 compute-0 python3.9[203676]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:23:06 compute-0 systemd[1]: Reloading.
Dec 03 21:23:06 compute-0 systemd-sysv-generator[203710]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:06 compute-0 systemd-rc-local-generator[203707]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:06 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 03 21:23:06 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 03 21:23:06 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 03 21:23:06 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 03 21:23:06 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 03 21:23:06 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 03 21:23:06 compute-0 setroubleshoot[203489]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 24592f25-fc0e-4d0c-8df9-da19c4f92acf
Dec 03 21:23:06 compute-0 setroubleshoot[203489]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 03 21:23:06 compute-0 setroubleshoot[203489]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 24592f25-fc0e-4d0c-8df9-da19c4f92acf
Dec 03 21:23:06 compute-0 sudo[203674]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:06 compute-0 setroubleshoot[203489]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 03 21:23:07 compute-0 sudo[203889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wirhzzgdazfzprfafhhbshuipvplgjjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796986.7712476-1061-132254925289659/AnsiballZ_systemd.py'
Dec 03 21:23:07 compute-0 sudo[203889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:07 compute-0 python3.9[203891]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:23:07 compute-0 systemd[1]: Reloading.
Dec 03 21:23:07 compute-0 systemd-sysv-generator[203920]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:07 compute-0 systemd-rc-local-generator[203915]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:07 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Dec 03 21:23:07 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 03 21:23:07 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 03 21:23:07 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 03 21:23:07 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 03 21:23:07 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 03 21:23:07 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 03 21:23:07 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 03 21:23:07 compute-0 ceph-mon[75204]: pgmap v495: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:07 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 03 21:23:07 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 03 21:23:07 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 03 21:23:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v496: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:07 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 03 21:23:08 compute-0 sudo[203889]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:08 compute-0 sudo[204104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrqooofztqufzbxfslpvyyuedjxuovlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796988.1071947-1061-9945974030020/AnsiballZ_systemd.py'
Dec 03 21:23:08 compute-0 sudo[204104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:08 compute-0 python3.9[204106]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:23:08 compute-0 systemd[1]: Reloading.
Dec 03 21:23:08 compute-0 systemd-rc-local-generator[204130]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:08 compute-0 systemd-sysv-generator[204137]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:09 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Dec 03 21:23:09 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Dec 03 21:23:09 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 03 21:23:09 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 03 21:23:09 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 03 21:23:09 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 03 21:23:09 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 03 21:23:09 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 03 21:23:09 compute-0 sudo[204104]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:09 compute-0 sudo[204317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvbsyuekxgombqhsjpgzxcalbuimtoaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796989.540193-1098-186599007546772/AnsiballZ_file.py'
Dec 03 21:23:09 compute-0 sudo[204317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:09 compute-0 ceph-mon[75204]: pgmap v496: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v497: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:10 compute-0 python3.9[204319]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:10 compute-0 sudo[204317]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:10 compute-0 sudo[204469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asxfunuuzdioqtwbnumgntkncjbuplhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796990.3844059-1106-236630230059354/AnsiballZ_find.py'
Dec 03 21:23:10 compute-0 sudo[204469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:10 compute-0 ceph-mon[75204]: pgmap v497: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:10 compute-0 python3.9[204471]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 03 21:23:10 compute-0 sudo[204469]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:11 compute-0 podman[204485]: 2025-12-03 21:23:11.200861189 +0000 UTC m=+0.127761589 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:23:11 compute-0 sudo[204648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoqsrxygssisxbhfecypeosztodngpwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796991.2154524-1114-160316809266059/AnsiballZ_command.py'
Dec 03 21:23:11 compute-0 sudo[204648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:11 compute-0 python3.9[204650]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:23:11 compute-0 sudo[204648]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v498: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:12 compute-0 python3.9[204804]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 03 21:23:12 compute-0 ceph-mon[75204]: pgmap v498: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:13 compute-0 python3.9[204954]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v499: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:14 compute-0 python3.9[205075]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764796993.0570912-1133-244076723217688/.source.xml follow=False _original_basename=secret.xml.j2 checksum=eb399b9585c91cefe0c954882ea59ab92b0cdac8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:15 compute-0 sudo[205225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkmuqsvpqgpgjxeuzbmhcaeupnswavzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796994.8923516-1148-52569406850362/AnsiballZ_command.py'
Dec 03 21:23:15 compute-0 sudo[205225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:15 compute-0 ceph-mon[75204]: pgmap v499: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:15 compute-0 python3.9[205227]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine c21de27e-a7fd-594b-8324-0697ba9aab3a
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:23:15 compute-0 polkitd[43470]: Registered Authentication Agent for unix-process:205229:293513 (system bus name :1.2477 [pkttyagent --process 205229 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 03 21:23:15 compute-0 polkitd[43470]: Unregistered Authentication Agent for unix-process:205229:293513 (system bus name :1.2477, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 03 21:23:15 compute-0 polkitd[43470]: Registered Authentication Agent for unix-process:205228:293513 (system bus name :1.2478 [pkttyagent --process 205228 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 03 21:23:15 compute-0 polkitd[43470]: Unregistered Authentication Agent for unix-process:205228:293513 (system bus name :1.2478, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 03 21:23:15 compute-0 sudo[205225]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v500: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:16 compute-0 python3.9[205389]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:16 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 03 21:23:16 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.030s CPU time.
Dec 03 21:23:16 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 03 21:23:17 compute-0 sudo[205539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlaworwfbhklyecdhfzsshnmwueaiiks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796996.675595-1164-108214648495927/AnsiballZ_command.py'
Dec 03 21:23:17 compute-0 sudo[205539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:17 compute-0 sudo[205539]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:17 compute-0 ceph-mon[75204]: pgmap v500: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:17 compute-0 sudo[205692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkfjmglwwpthtjffwqcvnrjakithphci ; FSID=c21de27e-a7fd-594b-8324-0697ba9aab3a KEY=AQB5pjBpAAAAABAAKWIHAEu4Fcpg9BW4WoYnAg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796997.502209-1172-220378742319206/AnsiballZ_command.py'
Dec 03 21:23:17 compute-0 sudo[205692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v501: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:18 compute-0 polkitd[43470]: Registered Authentication Agent for unix-process:205695:293769 (system bus name :1.2481 [pkttyagent --process 205695 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 03 21:23:18 compute-0 polkitd[43470]: Unregistered Authentication Agent for unix-process:205695:293769 (system bus name :1.2481, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 03 21:23:19 compute-0 sudo[205692]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:19 compute-0 ceph-mon[75204]: pgmap v501: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:19 compute-0 sudo[205850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvvecgtsujudzxerwvblrflqvwlltglm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764796999.4581146-1180-8971575289102/AnsiballZ_copy.py'
Dec 03 21:23:19 compute-0 sudo[205850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v502: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:20 compute-0 python3.9[205852]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:20 compute-0 sudo[205850]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:20 compute-0 sudo[206002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atiaeqnvaiiyomatogexdgalypcvjzwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797000.3141425-1188-69356850246978/AnsiballZ_stat.py'
Dec 03 21:23:20 compute-0 sudo[206002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:20 compute-0 python3.9[206004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:20 compute-0 sudo[206002]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:23:21
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['.mgr', 'vms', 'images', 'volumes', 'cephfs.cephfs.meta', 'backups', 'cephfs.cephfs.data']
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:23:21 compute-0 sudo[206125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrisylmditlvhbhglfzlevqevxxtloav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797000.3141425-1188-69356850246978/AnsiballZ_copy.py'
Dec 03 21:23:21 compute-0 sudo[206125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:21 compute-0 ceph-mon[75204]: pgmap v502: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:21 compute-0 python3.9[206127]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797000.3141425-1188-69356850246978/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:21 compute-0 sudo[206125]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:23:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v503: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:22 compute-0 sudo[206300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paanjznzmyodgzvakypdyqqmrgxxldps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797002.054742-1204-49787616017585/AnsiballZ_file.py'
Dec 03 21:23:22 compute-0 sudo[206300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:22 compute-0 sudo[206257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:23:22 compute-0 sudo[206257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:22 compute-0 sudo[206257]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:22 compute-0 sudo[206305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:23:22 compute-0 sudo[206305]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:22 compute-0 python3.9[206302]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:22 compute-0 sudo[206300]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:23 compute-0 sudo[206305]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:23:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:23:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:23:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:23:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:23:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:23:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:23:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:23:23 compute-0 sudo[206519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwwcupctczlygqgkkujblxzbilieqsvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797002.8720016-1212-49023250365810/AnsiballZ_stat.py'
Dec 03 21:23:23 compute-0 sudo[206519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:23 compute-0 podman[206484]: 2025-12-03 21:23:23.26938265 +0000 UTC m=+0.065613892 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 03 21:23:23 compute-0 sudo[206524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:23:23 compute-0 sudo[206524]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:23 compute-0 sudo[206524]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:23 compute-0 sudo[206557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:23:23 compute-0 sudo[206557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:23 compute-0 python3.9[206533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:23 compute-0 sudo[206519]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:23 compute-0 ceph-mon[75204]: pgmap v503: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:23:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:23:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:23:23 compute-0 podman[206622]: 2025-12-03 21:23:23.687446511 +0000 UTC m=+0.048121469 container create 5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_diffie, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:23:23 compute-0 systemd[1]: Started libpod-conmon-5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270.scope.
Dec 03 21:23:23 compute-0 sudo[206684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whismncjlqqinlvsloinuaoukluodcfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797002.8720016-1212-49023250365810/AnsiballZ_file.py'
Dec 03 21:23:23 compute-0 sudo[206684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:23 compute-0 podman[206622]: 2025-12-03 21:23:23.670640948 +0000 UTC m=+0.031315946 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:23:23 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:23:23 compute-0 podman[206622]: 2025-12-03 21:23:23.800988155 +0000 UTC m=+0.161663143 container init 5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_diffie, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:23:23 compute-0 podman[206622]: 2025-12-03 21:23:23.808720964 +0000 UTC m=+0.169395922 container start 5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_diffie, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:23:23 compute-0 podman[206622]: 2025-12-03 21:23:23.812053474 +0000 UTC m=+0.172728472 container attach 5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_diffie, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:23:23 compute-0 competent_diffie[206686]: 167 167
Dec 03 21:23:23 compute-0 systemd[1]: libpod-5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270.scope: Deactivated successfully.
Dec 03 21:23:23 compute-0 podman[206622]: 2025-12-03 21:23:23.814862349 +0000 UTC m=+0.175537357 container died 5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_diffie, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:23:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-7e2cef015a2e8ca04c3403b220cdf6f888d697c3b9eb95acf8094c07c8719cf5-merged.mount: Deactivated successfully.
Dec 03 21:23:23 compute-0 podman[206622]: 2025-12-03 21:23:23.868090156 +0000 UTC m=+0.228765114 container remove 5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_diffie, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:23:23 compute-0 systemd[1]: libpod-conmon-5b9309714ec9d3591158c7e9fab166591a04bbfd61d2788acf31b09983515270.scope: Deactivated successfully.
Dec 03 21:23:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v504: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:23 compute-0 python3.9[206688]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:23 compute-0 sudo[206684]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:24 compute-0 podman[206711]: 2025-12-03 21:23:24.103053777 +0000 UTC m=+0.074162882 container create e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_matsumoto, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:23:24 compute-0 systemd[1]: Started libpod-conmon-e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c.scope.
Dec 03 21:23:24 compute-0 podman[206711]: 2025-12-03 21:23:24.071256439 +0000 UTC m=+0.042365594 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:23:24 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:23:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f332692e413ef3b8093cfc00b992e82812c800508cc7e8136f6c6f2914f586/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f332692e413ef3b8093cfc00b992e82812c800508cc7e8136f6c6f2914f586/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f332692e413ef3b8093cfc00b992e82812c800508cc7e8136f6c6f2914f586/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f332692e413ef3b8093cfc00b992e82812c800508cc7e8136f6c6f2914f586/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2f332692e413ef3b8093cfc00b992e82812c800508cc7e8136f6c6f2914f586/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:24 compute-0 podman[206711]: 2025-12-03 21:23:24.2173172 +0000 UTC m=+0.188426295 container init e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_matsumoto, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:23:24 compute-0 podman[206711]: 2025-12-03 21:23:24.235162872 +0000 UTC m=+0.206271977 container start e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_matsumoto, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:23:24 compute-0 podman[206711]: 2025-12-03 21:23:24.240202318 +0000 UTC m=+0.211311403 container attach e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_matsumoto, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:23:24 compute-0 sudo[206889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngwdvzffehiakrxbcpkubvqerqslcmpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797004.2333565-1224-133360818094609/AnsiballZ_stat.py'
Dec 03 21:23:24 compute-0 sudo[206889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:24 compute-0 awesome_matsumoto[206752]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:23:24 compute-0 awesome_matsumoto[206752]: --> All data devices are unavailable
Dec 03 21:23:24 compute-0 systemd[1]: libpod-e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c.scope: Deactivated successfully.
Dec 03 21:23:24 compute-0 podman[206711]: 2025-12-03 21:23:24.829955293 +0000 UTC m=+0.801064358 container died e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_matsumoto, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:23:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-b2f332692e413ef3b8093cfc00b992e82812c800508cc7e8136f6c6f2914f586-merged.mount: Deactivated successfully.
Dec 03 21:23:24 compute-0 podman[206711]: 2025-12-03 21:23:24.898863022 +0000 UTC m=+0.869972127 container remove e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_matsumoto, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:23:24 compute-0 python3.9[206891]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:24 compute-0 systemd[1]: libpod-conmon-e9aac3d9ebbe1f7442ce6f927913c9165650e2838b12108277d0653f9ef5cf0c.scope: Deactivated successfully.
Dec 03 21:23:24 compute-0 sudo[206557]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:24 compute-0 sudo[206889]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:24 compute-0 sudo[206915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:23:25 compute-0 sudo[206915]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:25 compute-0 sudo[206915]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:25 compute-0 sudo[206963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:23:25 compute-0 sudo[206963]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:25 compute-0 sudo[207038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylczptwcighvublibzuvtjhvsgjqielq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797004.2333565-1224-133360818094609/AnsiballZ_file.py'
Dec 03 21:23:25 compute-0 sudo[207038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:25 compute-0 python3.9[207040]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.76bi4mu5 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:25 compute-0 sudo[207038]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:25 compute-0 podman[207054]: 2025-12-03 21:23:25.430941211 +0000 UTC m=+0.062910188 container create 0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_lovelace, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:23:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:25 compute-0 systemd[1]: Started libpod-conmon-0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da.scope.
Dec 03 21:23:25 compute-0 podman[207054]: 2025-12-03 21:23:25.40829094 +0000 UTC m=+0.040259927 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:23:25 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:23:25 compute-0 podman[207054]: 2025-12-03 21:23:25.529720808 +0000 UTC m=+0.161689795 container init 0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_lovelace, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:23:25 compute-0 podman[207054]: 2025-12-03 21:23:25.542332267 +0000 UTC m=+0.174301234 container start 0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_lovelace, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle)
Dec 03 21:23:25 compute-0 podman[207054]: 2025-12-03 21:23:25.545933734 +0000 UTC m=+0.177902721 container attach 0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_lovelace, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:23:25 compute-0 infallible_lovelace[207087]: 167 167
Dec 03 21:23:25 compute-0 systemd[1]: libpod-0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da.scope: Deactivated successfully.
Dec 03 21:23:25 compute-0 podman[207054]: 2025-12-03 21:23:25.549975804 +0000 UTC m=+0.181944791 container died 0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:23:25 compute-0 ceph-mon[75204]: pgmap v504: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-89d8306c9ab86f21e9d7f2ffee11c455f66a8ff55e491246eaf457f0465bca13-merged.mount: Deactivated successfully.
Dec 03 21:23:25 compute-0 podman[207054]: 2025-12-03 21:23:25.600729674 +0000 UTC m=+0.232698631 container remove 0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_lovelace, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:23:25 compute-0 systemd[1]: libpod-conmon-0c6268004dffb7241b4d788f75e9036bf64d8d6db4a46fdd1f8be19748ecb9da.scope: Deactivated successfully.
Dec 03 21:23:25 compute-0 podman[207163]: 2025-12-03 21:23:25.812986412 +0000 UTC m=+0.060044392 container create 983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_yonath, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:23:25 compute-0 systemd[1]: Started libpod-conmon-983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa.scope.
Dec 03 21:23:25 compute-0 podman[207163]: 2025-12-03 21:23:25.782679284 +0000 UTC m=+0.029737314 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:23:25 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:23:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79224915a610e681cbfa332a46369a049c7238088ca88e71988cf29d01f20e4d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79224915a610e681cbfa332a46369a049c7238088ca88e71988cf29d01f20e4d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79224915a610e681cbfa332a46369a049c7238088ca88e71988cf29d01f20e4d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79224915a610e681cbfa332a46369a049c7238088ca88e71988cf29d01f20e4d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:25 compute-0 podman[207163]: 2025-12-03 21:23:25.905159199 +0000 UTC m=+0.152217229 container init 983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_yonath, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:23:25 compute-0 podman[207163]: 2025-12-03 21:23:25.922153998 +0000 UTC m=+0.169211968 container start 983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_yonath, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:23:25 compute-0 podman[207163]: 2025-12-03 21:23:25.925622401 +0000 UTC m=+0.172680441 container attach 983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_yonath, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:23:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v505: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:26 compute-0 sudo[207264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxjwuvadlmqvlqesphzikvzhbhcxgewu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797005.6269789-1236-118136427946881/AnsiballZ_stat.py'
Dec 03 21:23:26 compute-0 sudo[207264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:26 compute-0 python3.9[207266]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:26 compute-0 jovial_yonath[207209]: {
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:     "0": [
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:         {
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "devices": [
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "/dev/loop3"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             ],
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_name": "ceph_lv0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_size": "21470642176",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "name": "ceph_lv0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "tags": {
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cluster_name": "ceph",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.crush_device_class": "",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.encrypted": "0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.objectstore": "bluestore",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osd_id": "0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.type": "block",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.vdo": "0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.with_tpm": "0"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             },
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "type": "block",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "vg_name": "ceph_vg0"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:         }
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:     ],
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:     "1": [
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:         {
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "devices": [
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "/dev/loop4"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             ],
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_name": "ceph_lv1",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_size": "21470642176",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "name": "ceph_lv1",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "tags": {
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cluster_name": "ceph",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.crush_device_class": "",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.encrypted": "0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.objectstore": "bluestore",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osd_id": "1",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.type": "block",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.vdo": "0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.with_tpm": "0"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             },
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "type": "block",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "vg_name": "ceph_vg1"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:         }
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:     ],
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:     "2": [
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:         {
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "devices": [
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "/dev/loop5"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             ],
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_name": "ceph_lv2",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_size": "21470642176",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "name": "ceph_lv2",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "tags": {
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.cluster_name": "ceph",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.crush_device_class": "",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.encrypted": "0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.objectstore": "bluestore",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osd_id": "2",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.type": "block",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.vdo": "0",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:                 "ceph.with_tpm": "0"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             },
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "type": "block",
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:             "vg_name": "ceph_vg2"
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:         }
Dec 03 21:23:26 compute-0 jovial_yonath[207209]:     ]
Dec 03 21:23:26 compute-0 jovial_yonath[207209]: }
Dec 03 21:23:26 compute-0 sudo[207264]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:26 compute-0 systemd[1]: libpod-983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa.scope: Deactivated successfully.
Dec 03 21:23:26 compute-0 podman[207273]: 2025-12-03 21:23:26.356811967 +0000 UTC m=+0.040009111 container died 983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_yonath, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:23:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-79224915a610e681cbfa332a46369a049c7238088ca88e71988cf29d01f20e4d-merged.mount: Deactivated successfully.
Dec 03 21:23:26 compute-0 podman[207273]: 2025-12-03 21:23:26.406364424 +0000 UTC m=+0.089561558 container remove 983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_yonath, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:23:26 compute-0 systemd[1]: libpod-conmon-983abcff2e77e606016d68776cade8ea64cb9d796d4867c013ab6627599a62aa.scope: Deactivated successfully.
Dec 03 21:23:26 compute-0 sudo[206963]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:26 compute-0 sudo[207335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:23:26 compute-0 sudo[207384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vedkkctokiangpkemagtklcyalhbptvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797005.6269789-1236-118136427946881/AnsiballZ_file.py'
Dec 03 21:23:26 compute-0 sudo[207384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:26 compute-0 sudo[207335]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:26 compute-0 sudo[207335]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:26 compute-0 sudo[207389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:23:26 compute-0 sudo[207389]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:26 compute-0 python3.9[207387]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:26 compute-0 sudo[207384]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:27 compute-0 podman[207450]: 2025-12-03 21:23:27.033247221 +0000 UTC m=+0.066471934 container create 142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_shtern, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 03 21:23:27 compute-0 systemd[1]: Started libpod-conmon-142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5.scope.
Dec 03 21:23:27 compute-0 podman[207450]: 2025-12-03 21:23:27.003872509 +0000 UTC m=+0.037097282 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:23:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:23:27 compute-0 podman[207450]: 2025-12-03 21:23:27.115004607 +0000 UTC m=+0.148229310 container init 142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_shtern, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:23:27 compute-0 podman[207450]: 2025-12-03 21:23:27.121975986 +0000 UTC m=+0.155200679 container start 142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_shtern, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:23:27 compute-0 podman[207450]: 2025-12-03 21:23:27.124968836 +0000 UTC m=+0.158193529 container attach 142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_shtern, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:23:27 compute-0 objective_shtern[207512]: 167 167
Dec 03 21:23:27 compute-0 systemd[1]: libpod-142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5.scope: Deactivated successfully.
Dec 03 21:23:27 compute-0 podman[207450]: 2025-12-03 21:23:27.127276969 +0000 UTC m=+0.160501672 container died 142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_shtern, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:23:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-4c633a7b308c590406614c24f4fdba1ed6999b90fb872f1a00dbbd2fe059141b-merged.mount: Deactivated successfully.
Dec 03 21:23:27 compute-0 podman[207450]: 2025-12-03 21:23:27.171695318 +0000 UTC m=+0.204920011 container remove 142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=objective_shtern, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:23:27 compute-0 systemd[1]: libpod-conmon-142e6d34b327cd7fd07e4ba0fe4e05c0290fc60c009745841bd4fdb1906733e5.scope: Deactivated successfully.
Dec 03 21:23:27 compute-0 sudo[207613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvnqpfxnkaperwnrptbhfgjckvqakdbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797007.0137506-1249-69416909712595/AnsiballZ_command.py'
Dec 03 21:23:27 compute-0 sudo[207613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:27 compute-0 podman[207615]: 2025-12-03 21:23:27.370118251 +0000 UTC m=+0.057116641 container create d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_lalande, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:23:27 compute-0 systemd[1]: Started libpod-conmon-d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf.scope.
Dec 03 21:23:27 compute-0 podman[207615]: 2025-12-03 21:23:27.343888234 +0000 UTC m=+0.030886684 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:23:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:23:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11e009ee7eb74e0daaae94599661adc0e44348e177d6bda88d11ff8c452c7b7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11e009ee7eb74e0daaae94599661adc0e44348e177d6bda88d11ff8c452c7b7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11e009ee7eb74e0daaae94599661adc0e44348e177d6bda88d11ff8c452c7b7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11e009ee7eb74e0daaae94599661adc0e44348e177d6bda88d11ff8c452c7b7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:23:27 compute-0 podman[207615]: 2025-12-03 21:23:27.475169517 +0000 UTC m=+0.162167937 container init d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_lalande, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:23:27 compute-0 podman[207615]: 2025-12-03 21:23:27.488194229 +0000 UTC m=+0.175192619 container start d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_lalande, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:23:27 compute-0 podman[207615]: 2025-12-03 21:23:27.492180906 +0000 UTC m=+0.179179286 container attach d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_lalande, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:23:27 compute-0 python3.9[207628]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:23:27 compute-0 ceph-mon[75204]: pgmap v505: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:27 compute-0 sudo[207613]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:23:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v506: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:28 compute-0 lvm[207812]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:23:28 compute-0 lvm[207812]: VG ceph_vg0 finished
Dec 03 21:23:28 compute-0 lvm[207813]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:23:28 compute-0 lvm[207813]: VG ceph_vg1 finished
Dec 03 21:23:28 compute-0 lvm[207816]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:23:28 compute-0 lvm[207816]: VG ceph_vg2 finished
Dec 03 21:23:28 compute-0 lvm[207841]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:23:28 compute-0 lvm[207841]: VG ceph_vg2 finished
Dec 03 21:23:28 compute-0 dreamy_lalande[207634]: {}
Dec 03 21:23:28 compute-0 systemd[1]: libpod-d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf.scope: Deactivated successfully.
Dec 03 21:23:28 compute-0 systemd[1]: libpod-d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf.scope: Consumed 1.340s CPU time.
Dec 03 21:23:28 compute-0 podman[207615]: 2025-12-03 21:23:28.362553024 +0000 UTC m=+1.049551404 container died d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_lalande, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:23:28 compute-0 sudo[207869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghhvgxfmxknpnzjnxtffotxdahidktfb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764797007.8163776-1257-129740912141427/AnsiballZ_edpm_nftables_from_files.py'
Dec 03 21:23:28 compute-0 sudo[207869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-d11e009ee7eb74e0daaae94599661adc0e44348e177d6bda88d11ff8c452c7b7-merged.mount: Deactivated successfully.
Dec 03 21:23:28 compute-0 podman[207615]: 2025-12-03 21:23:28.418601097 +0000 UTC m=+1.105599457 container remove d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_lalande, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:23:28 compute-0 systemd[1]: libpod-conmon-d04fa279c88dd200b477ce4bfc0758d9e702a4382fc11fb056d67ac51cc0cdbf.scope: Deactivated successfully.
Dec 03 21:23:28 compute-0 sudo[207389]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:23:28 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:23:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:23:28 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:23:28 compute-0 sudo[207883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:23:28 compute-0 sudo[207883]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:23:28 compute-0 sudo[207883]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:28 compute-0 python3[207881]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 03 21:23:28 compute-0 sudo[207869]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:29 compute-0 sudo[208057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxutyqgpsfjdzinhhovevftpjqrgmuqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797008.8678257-1265-215740414386936/AnsiballZ_stat.py'
Dec 03 21:23:29 compute-0 sudo[208057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:29 compute-0 ceph-mon[75204]: pgmap v506: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:23:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:23:29 compute-0 python3.9[208059]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:29 compute-0 sudo[208057]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:29 compute-0 sudo[208135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iczactdirbebigggiyhxmkitkypuxgiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797008.8678257-1265-215740414386936/AnsiballZ_file.py'
Dec 03 21:23:29 compute-0 sudo[208135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v507: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:30 compute-0 python3.9[208137]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:30 compute-0 sudo[208135]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:30 compute-0 sudo[208287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqwaacdhmkckwomjrsieugjanyiocwpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797010.2565746-1277-191402120050160/AnsiballZ_stat.py'
Dec 03 21:23:30 compute-0 sudo[208287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:30 compute-0 python3.9[208289]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:30 compute-0 sudo[208287]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:31 compute-0 sudo[208365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhdfpcdkfgagqsftqtfdbhwbhebywzry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797010.2565746-1277-191402120050160/AnsiballZ_file.py'
Dec 03 21:23:31 compute-0 sudo[208365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:31 compute-0 python3.9[208367]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:31 compute-0 sudo[208365]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:31 compute-0 ceph-mon[75204]: pgmap v507: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v508: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:32 compute-0 sudo[208517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkghgiemumqtdajbtdoglgapbszdozdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797011.7165105-1289-211223826186908/AnsiballZ_stat.py'
Dec 03 21:23:32 compute-0 sudo[208517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:32 compute-0 python3.9[208519]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:32 compute-0 sudo[208517]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:32 compute-0 sudo[208595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyayhujzfxwqrhrtpwtoxtznogovkfzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797011.7165105-1289-211223826186908/AnsiballZ_file.py'
Dec 03 21:23:32 compute-0 sudo[208595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:32 compute-0 python3.9[208597]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:32 compute-0 sudo[208595]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:33 compute-0 sudo[208747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufgdbiykyriceyihpgtrepimpnxkxlzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797013.1946838-1301-255470699765261/AnsiballZ_stat.py'
Dec 03 21:23:33 compute-0 sudo[208747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:33 compute-0 ceph-mon[75204]: pgmap v508: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:33 compute-0 python3.9[208749]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:33 compute-0 sudo[208747]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v509: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:34 compute-0 sudo[208825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lklfneqziilvvdrsfjblwhiwkuwjfpve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797013.1946838-1301-255470699765261/AnsiballZ_file.py'
Dec 03 21:23:34 compute-0 sudo[208825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:34 compute-0 python3.9[208827]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:34 compute-0 sudo[208825]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:34 compute-0 sudo[208977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svjmwapgcaktynvcchpkkdaripowasmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797014.4775033-1313-154968413418526/AnsiballZ_stat.py'
Dec 03 21:23:34 compute-0 sudo[208977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:35 compute-0 python3.9[208979]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:35 compute-0 sudo[208977]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:35 compute-0 sudo[209102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuvvfwferrmqjcuqhzamwlcvdsqqywhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797014.4775033-1313-154968413418526/AnsiballZ_copy.py'
Dec 03 21:23:35 compute-0 sudo[209102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:35 compute-0 ceph-mon[75204]: pgmap v509: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:35 compute-0 python3.9[209104]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764797014.4775033-1313-154968413418526/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:35 compute-0 sudo[209102]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v510: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:36 compute-0 sudo[209254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwctaordtkfyeutechuvqdlxiyyfffiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797016.0932198-1328-137148601923578/AnsiballZ_file.py'
Dec 03 21:23:36 compute-0 sudo[209254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:36 compute-0 python3.9[209256]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:36 compute-0 sudo[209254]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:37 compute-0 sudo[209406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnaudojdyrbdjpqlgtifuandnzvyxpfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797017.0010593-1336-147514698437163/AnsiballZ_command.py'
Dec 03 21:23:37 compute-0 sudo[209406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:37 compute-0 python3.9[209408]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:23:37 compute-0 sudo[209406]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:37 compute-0 ceph-mon[75204]: pgmap v510: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v511: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:38 compute-0 sudo[209561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqisiypqjsdzntmefgovlocdiyhqtfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797017.8308253-1344-213205690325219/AnsiballZ_blockinfile.py'
Dec 03 21:23:38 compute-0 sudo[209561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:38 compute-0 python3.9[209563]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:38 compute-0 sudo[209561]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:39 compute-0 sudo[209713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reuiiwjxlqnivwipythcoreobzzqgpvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797018.8043606-1353-56199316879989/AnsiballZ_command.py'
Dec 03 21:23:39 compute-0 sudo[209713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:39 compute-0 python3.9[209715]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:23:39 compute-0 sudo[209713]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:39 compute-0 ceph-mon[75204]: pgmap v511: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v512: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:40 compute-0 sudo[209866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zycoyswefrxcxzsvyazyoeeqyyczuspu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797019.6036155-1361-70501277298151/AnsiballZ_stat.py'
Dec 03 21:23:40 compute-0 sudo[209866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:40 compute-0 python3.9[209868]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:23:40 compute-0 sudo[209866]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:40 compute-0 sudo[210020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzsrpxssiituyisosspvaxenepyuioje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797020.4292467-1369-214508485358616/AnsiballZ_command.py'
Dec 03 21:23:40 compute-0 sudo[210020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:40 compute-0 python3.9[210022]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:23:41 compute-0 sudo[210020]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:41 compute-0 sudo[210191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhukciqkvfsqmafiplhquzunkyofuumk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797021.3107014-1377-125685172095447/AnsiballZ_file.py'
Dec 03 21:23:41 compute-0 sudo[210191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:41 compute-0 podman[210149]: 2025-12-03 21:23:41.764024556 +0000 UTC m=+0.130471482 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:23:41 compute-0 ceph-mon[75204]: pgmap v512: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:41 compute-0 python3.9[210196]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:41 compute-0 sudo[210191]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v513: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:42 compute-0 sudo[210353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiqecedbvbvzoaxjljxfzsfgderoeblu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797022.1774807-1385-148802998682543/AnsiballZ_stat.py'
Dec 03 21:23:42 compute-0 sudo[210353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:42 compute-0 python3.9[210355]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:42 compute-0 sudo[210353]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:43 compute-0 sudo[210476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlpcefueiymjdfenlfcjvsyamoabuyof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797022.1774807-1385-148802998682543/AnsiballZ_copy.py'
Dec 03 21:23:43 compute-0 sudo[210476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:43 compute-0 python3.9[210478]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797022.1774807-1385-148802998682543/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:43 compute-0 sudo[210476]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:43 compute-0 ceph-mon[75204]: pgmap v513: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v514: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:44 compute-0 sudo[210628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-patiunbbxcuyhpkkwjcuiwjscthrytux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797023.7467504-1400-156375556082131/AnsiballZ_stat.py'
Dec 03 21:23:44 compute-0 sudo[210628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:44 compute-0 python3.9[210630]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:44 compute-0 sudo[210628]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:44 compute-0 sudo[210751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbndfusavephdvmppixhrnazbneltqlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797023.7467504-1400-156375556082131/AnsiballZ_copy.py'
Dec 03 21:23:44 compute-0 sudo[210751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:45 compute-0 python3.9[210753]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797023.7467504-1400-156375556082131/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:45 compute-0 sudo[210751]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:45 compute-0 sudo[210903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uojmrdofkmugijmflzagatfuisuklqzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797025.3866394-1415-234285563632014/AnsiballZ_stat.py'
Dec 03 21:23:45 compute-0 sudo[210903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:45 compute-0 ceph-mon[75204]: pgmap v514: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v515: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:45 compute-0 python3.9[210905]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:23:45 compute-0 sudo[210903]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:46 compute-0 sudo[211026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqmoobromqmehsnlpzxziybadqpfexqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797025.3866394-1415-234285563632014/AnsiballZ_copy.py'
Dec 03 21:23:46 compute-0 sudo[211026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:46 compute-0 python3.9[211028]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797025.3866394-1415-234285563632014/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:23:46 compute-0 sudo[211026]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:47 compute-0 sudo[211178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgyyyiruewzinxwfywqdmkezkvncnqrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797026.8162334-1430-189751467909043/AnsiballZ_systemd.py'
Dec 03 21:23:47 compute-0 sudo[211178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:47 compute-0 python3.9[211180]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:23:47 compute-0 systemd[1]: Reloading.
Dec 03 21:23:47 compute-0 systemd-rc-local-generator[211204]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:47 compute-0 systemd-sysv-generator[211208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:47 compute-0 ceph-mon[75204]: pgmap v515: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:47 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Dec 03 21:23:47 compute-0 sudo[211178]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v516: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:48 compute-0 sudo[211368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjkxqzspadvacdasgvmibgvrchgheicf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797028.149077-1438-152927932282621/AnsiballZ_systemd.py'
Dec 03 21:23:48 compute-0 sudo[211368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:23:48 compute-0 python3.9[211370]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 03 21:23:48 compute-0 systemd[1]: Reloading.
Dec 03 21:23:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:23:48.926 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:23:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:23:48.928 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:23:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:23:48.928 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:23:48 compute-0 systemd-rc-local-generator[211398]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:48 compute-0 systemd-sysv-generator[211403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:49 compute-0 systemd[1]: Reloading.
Dec 03 21:23:49 compute-0 systemd-rc-local-generator[211435]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:23:49 compute-0 systemd-sysv-generator[211438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:23:49 compute-0 sudo[211368]: pam_unix(sudo:session): session closed for user root
Dec 03 21:23:49 compute-0 ceph-mon[75204]: pgmap v516: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v517: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:49 compute-0 sshd-session[152133]: Connection closed by 192.168.122.30 port 45744
Dec 03 21:23:49 compute-0 sshd-session[152112]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:23:49 compute-0 systemd[1]: session-48.scope: Deactivated successfully.
Dec 03 21:23:49 compute-0 systemd[1]: session-48.scope: Consumed 4min 171ms CPU time.
Dec 03 21:23:49 compute-0 systemd-logind[787]: Session 48 logged out. Waiting for processes to exit.
Dec 03 21:23:49 compute-0 systemd-logind[787]: Removed session 48.
Dec 03 21:23:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:23:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:23:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:23:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:23:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:23:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:23:51 compute-0 ceph-mon[75204]: pgmap v517: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v518: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:53 compute-0 ceph-mon[75204]: pgmap v518: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #27. Immutable memtables: 0.
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.846071) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 27
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797033846176, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2044, "num_deletes": 251, "total_data_size": 2380796, "memory_usage": 2425616, "flush_reason": "Manual Compaction"}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #28: started
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797033865437, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 28, "file_size": 2308951, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9052, "largest_seqno": 11095, "table_properties": {"data_size": 2299745, "index_size": 5828, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 17908, "raw_average_key_size": 19, "raw_value_size": 2281328, "raw_average_value_size": 2482, "num_data_blocks": 268, "num_entries": 919, "num_filter_entries": 919, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796798, "oldest_key_time": 1764796798, "file_creation_time": 1764797033, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 28, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 19520 microseconds, and 10596 cpu microseconds.
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.865620) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #28: 2308951 bytes OK
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.865669) [db/memtable_list.cc:519] [default] Level-0 commit table #28 started
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.867047) [db/memtable_list.cc:722] [default] Level-0 commit table #28: memtable #1 done
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.867061) EVENT_LOG_v1 {"time_micros": 1764797033867057, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.867079) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2372261, prev total WAL file size 2372261, number of live WAL files 2.
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000024.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.867913) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [28(2254KB)], [26(4750KB)]
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797033867969, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [28], "files_L6": [26], "score": -1, "input_data_size": 7173959, "oldest_snapshot_seqno": -1}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #29: 3208 keys, 6017410 bytes, temperature: kUnknown
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797033904614, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 29, "file_size": 6017410, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5991407, "index_size": 16869, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8069, "raw_key_size": 74048, "raw_average_key_size": 23, "raw_value_size": 5929471, "raw_average_value_size": 1848, "num_data_blocks": 745, "num_entries": 3208, "num_filter_entries": 3208, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797033, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.904933) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 6017410 bytes
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.906249) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.2 rd, 163.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 4.6 +0.0 blob) out(5.7 +0.0 blob), read-write-amplify(5.7) write-amplify(2.6) OK, records in: 3722, records dropped: 514 output_compression: NoCompression
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.906269) EVENT_LOG_v1 {"time_micros": 1764797033906259, "job": 10, "event": "compaction_finished", "compaction_time_micros": 36750, "compaction_time_cpu_micros": 12925, "output_level": 6, "num_output_files": 1, "total_output_size": 6017410, "num_input_records": 3722, "num_output_records": 3208, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000028.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797033906918, "job": 10, "event": "table_file_deletion", "file_number": 28}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797033907995, "job": 10, "event": "table_file_deletion", "file_number": 26}
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.867821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.908066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.908075) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.908077) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.908079) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:23:53 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:23:53.908081) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:23:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v519: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:54 compute-0 podman[211467]: 2025-12-03 21:23:54.113870178 +0000 UTC m=+0.057850992 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 03 21:23:55 compute-0 sshd-session[211486]: Accepted publickey for zuul from 192.168.122.30 port 55546 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:23:55 compute-0 systemd-logind[787]: New session 49 of user zuul.
Dec 03 21:23:55 compute-0 systemd[1]: Started Session 49 of User zuul.
Dec 03 21:23:55 compute-0 sshd-session[211486]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:23:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:23:55 compute-0 ceph-mon[75204]: pgmap v519: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v520: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:56 compute-0 python3.9[211639]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:23:57 compute-0 ceph-mon[75204]: pgmap v520: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v521: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:58 compute-0 python3.9[211793]: ansible-ansible.builtin.service_facts Invoked
Dec 03 21:23:58 compute-0 network[211810]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:23:58 compute-0 network[211811]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:23:58 compute-0 network[211812]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:23:59 compute-0 ceph-mon[75204]: pgmap v521: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:23:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v522: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v523: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:02 compute-0 ceph-mon[75204]: pgmap v522: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:03 compute-0 ceph-mon[75204]: pgmap v523: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v524: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:05 compute-0 sudo[212082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boinadsoiynyyyizhuexbamzcwluwmaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797044.6847034-47-264775950991413/AnsiballZ_setup.py'
Dec 03 21:24:05 compute-0 sudo[212082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:05 compute-0 ceph-mon[75204]: pgmap v524: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:05 compute-0 python3.9[212084]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 03 21:24:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:05 compute-0 sudo[212082]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v525: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:06 compute-0 sudo[212166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luhrlottgkhygybsmtfyosxdqriqndbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797044.6847034-47-264775950991413/AnsiballZ_dnf.py'
Dec 03 21:24:06 compute-0 sudo[212166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:06 compute-0 python3.9[212168]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:24:07 compute-0 ceph-mon[75204]: pgmap v525: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v526: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:09 compute-0 ceph-mon[75204]: pgmap v526: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v527: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:11 compute-0 ceph-mon[75204]: pgmap v527: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v528: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:12 compute-0 sudo[212166]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:12 compute-0 podman[212170]: 2025-12-03 21:24:12.22591636 +0000 UTC m=+0.144023143 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 03 21:24:12 compute-0 sudo[212345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwnoeeqgaeheopaohylmxotbnmuqeqbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797052.3058445-59-235634967458596/AnsiballZ_stat.py'
Dec 03 21:24:12 compute-0 sudo[212345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:13 compute-0 python3.9[212347]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:24:13 compute-0 sudo[212345]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:13 compute-0 ceph-mon[75204]: pgmap v528: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:13 compute-0 sudo[212497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tydoccvzncyrweyavwfdcqzrhujehatx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797053.4150062-69-206473765504198/AnsiballZ_command.py'
Dec 03 21:24:13 compute-0 sudo[212497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v529: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:14 compute-0 python3.9[212499]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:24:14 compute-0 sudo[212497]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:14 compute-0 sudo[212650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcaonegahqywalmwyfpkxqxgpvzmyuyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797054.5187879-79-178855543579128/AnsiballZ_stat.py'
Dec 03 21:24:14 compute-0 sudo[212650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:15 compute-0 python3.9[212652]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:24:15 compute-0 sudo[212650]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:15 compute-0 ceph-mon[75204]: pgmap v529: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:15 compute-0 sudo[212802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbmvquradunrwtkgeairtsrxagtwuyfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797055.305768-87-141943862295386/AnsiballZ_command.py'
Dec 03 21:24:15 compute-0 sudo[212802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v530: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:15 compute-0 python3.9[212804]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:24:16 compute-0 sudo[212802]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:16 compute-0 sudo[212955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epwywizhmcbfsdajwodohvispfqdyqnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797056.2348127-95-238511596842489/AnsiballZ_stat.py'
Dec 03 21:24:16 compute-0 sudo[212955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:16 compute-0 python3.9[212957]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:16 compute-0 sudo[212955]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:17 compute-0 ceph-mon[75204]: pgmap v530: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:17 compute-0 sudo[213078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nolntecomaoavcfrxibmkjsfkzebzkfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797056.2348127-95-238511596842489/AnsiballZ_copy.py'
Dec 03 21:24:17 compute-0 sudo[213078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:17 compute-0 python3.9[213080]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797056.2348127-95-238511596842489/.source.iscsi _original_basename=.nfh_tch2 follow=False checksum=9d1afd0835abf4d74a5804351287da0dac4ada05 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:17 compute-0 sudo[213078]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v531: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:18 compute-0 sudo[213230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crtcjsgcjswzsmhuhswskintyhfdfqkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797057.82683-110-73180981395304/AnsiballZ_file.py'
Dec 03 21:24:18 compute-0 sudo[213230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:18 compute-0 python3.9[213232]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:18 compute-0 sudo[213230]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:19 compute-0 ceph-mon[75204]: pgmap v531: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:19 compute-0 sudo[213382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyrirdgumisbbkljlsbjsmpabllxaiyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797058.7477064-118-278645114006362/AnsiballZ_lineinfile.py'
Dec 03 21:24:19 compute-0 sudo[213382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:19 compute-0 python3.9[213384]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:19 compute-0 sudo[213382]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v532: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:20 compute-0 sudo[213534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmionaefhmnujmrqlagjnqvkhyfitfzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797059.7457974-127-230728051255538/AnsiballZ_systemd_service.py'
Dec 03 21:24:20 compute-0 sudo[213534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:20 compute-0 python3.9[213536]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:24:20 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 03 21:24:20 compute-0 sudo[213534]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:24:21
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.meta', '.mgr', 'images', 'backups', 'cephfs.cephfs.data', 'volumes', 'vms']
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:24:21 compute-0 ceph-mon[75204]: pgmap v532: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:21 compute-0 sudo[213690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elnlteprndmfghcytydfxlhwobqxfifq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797061.1164627-135-237686289516958/AnsiballZ_systemd_service.py'
Dec 03 21:24:21 compute-0 sudo[213690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:24:21 compute-0 python3.9[213692]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:24:21 compute-0 systemd[1]: Reloading.
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:24:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v533: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:21 compute-0 systemd-rc-local-generator[213721]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:24:21 compute-0 systemd-sysv-generator[213724]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:24:22 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 03 21:24:22 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 03 21:24:22 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Dec 03 21:24:22 compute-0 systemd[1]: Started Open-iSCSI.
Dec 03 21:24:22 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 03 21:24:22 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 03 21:24:22 compute-0 sudo[213690]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:23 compute-0 sudo[213891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tslavwwvfanakvqoeycjonofgirzjgrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797062.6990864-146-177893050571755/AnsiballZ_service_facts.py'
Dec 03 21:24:23 compute-0 sudo[213891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:23 compute-0 python3.9[213893]: ansible-ansible.builtin.service_facts Invoked
Dec 03 21:24:23 compute-0 ceph-mon[75204]: pgmap v533: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:23 compute-0 network[213910]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:24:23 compute-0 network[213911]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:24:23 compute-0 network[213912]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:24:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v534: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:24 compute-0 podman[213918]: 2025-12-03 21:24:24.410231732 +0000 UTC m=+0.097675310 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:24:25 compute-0 ceph-mon[75204]: pgmap v534: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v535: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:27 compute-0 ceph-mon[75204]: pgmap v535: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:27 compute-0 sudo[213891]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:24:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v536: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:28 compute-0 sudo[214201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejtusucyhblrvnblwnazukkuugtphihr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797067.8304331-156-6506441527457/AnsiballZ_file.py'
Dec 03 21:24:28 compute-0 sudo[214201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:28 compute-0 python3.9[214203]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 03 21:24:28 compute-0 sudo[214201]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:28 compute-0 sudo[214250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:24:28 compute-0 sudo[214250]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:28 compute-0 sudo[214250]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:28 compute-0 sudo[214299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:24:28 compute-0 sudo[214299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:29 compute-0 sudo[214417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcurcofgroxsvxswkqmfmzpjyywggeaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797068.6070936-164-278388129734128/AnsiballZ_modprobe.py'
Dec 03 21:24:29 compute-0 sudo[214417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:29 compute-0 python3.9[214419]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 03 21:24:29 compute-0 sudo[214417]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:29 compute-0 ceph-mon[75204]: pgmap v536: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:29 compute-0 sudo[214299]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:24:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:24:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:24:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:24:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:24:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:24:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:24:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:24:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:24:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:24:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:24:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:24:29 compute-0 sudo[214465]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:24:29 compute-0 sudo[214465]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:29 compute-0 sudo[214465]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:29 compute-0 sudo[214513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:24:29 compute-0 sudo[214513]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:29 compute-0 sudo[214641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvnhopmyeywkfnvvbzicuoluzxjnrzoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797069.5216115-172-16157622866746/AnsiballZ_stat.py'
Dec 03 21:24:29 compute-0 sudo[214641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:29 compute-0 podman[214655]: 2025-12-03 21:24:29.823828568 +0000 UTC m=+0.035202234 container create 2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_pascal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:24:29 compute-0 systemd[1]: Started libpod-conmon-2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05.scope.
Dec 03 21:24:29 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:24:29 compute-0 podman[214655]: 2025-12-03 21:24:29.90072888 +0000 UTC m=+0.112102576 container init 2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_pascal, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:24:29 compute-0 podman[214655]: 2025-12-03 21:24:29.809932866 +0000 UTC m=+0.021306552 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:24:29 compute-0 podman[214655]: 2025-12-03 21:24:29.910314107 +0000 UTC m=+0.121687793 container start 2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_pascal, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:24:29 compute-0 podman[214655]: 2025-12-03 21:24:29.914317244 +0000 UTC m=+0.125690930 container attach 2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_pascal, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:24:29 compute-0 nifty_pascal[214671]: 167 167
Dec 03 21:24:29 compute-0 systemd[1]: libpod-2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05.scope: Deactivated successfully.
Dec 03 21:24:29 compute-0 conmon[214671]: conmon 2700e846cb0e746b5ffb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05.scope/container/memory.events
Dec 03 21:24:29 compute-0 podman[214655]: 2025-12-03 21:24:29.91790385 +0000 UTC m=+0.129277526 container died 2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_pascal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:24:29 compute-0 python3.9[214649]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:29 compute-0 sudo[214641]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-9184fec759360b935444954038a2257d11c5832a3f4335a686732fdc38766f64-merged.mount: Deactivated successfully.
Dec 03 21:24:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v537: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:29 compute-0 podman[214655]: 2025-12-03 21:24:29.985824721 +0000 UTC m=+0.197198387 container remove 2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_pascal, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:24:29 compute-0 systemd[1]: libpod-conmon-2700e846cb0e746b5ffba3266810219a091f54642673dcfaebefe24e60bf0d05.scope: Deactivated successfully.
Dec 03 21:24:30 compute-0 podman[214743]: 2025-12-03 21:24:30.155079878 +0000 UTC m=+0.040121087 container create ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_ellis, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:24:30 compute-0 systemd[1]: Started libpod-conmon-ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323.scope.
Dec 03 21:24:30 compute-0 podman[214743]: 2025-12-03 21:24:30.135824432 +0000 UTC m=+0.020865731 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:24:30 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c14ffa41a42faca11da06325e461702d72714a994021796cbe7f7047bfdf80c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c14ffa41a42faca11da06325e461702d72714a994021796cbe7f7047bfdf80c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c14ffa41a42faca11da06325e461702d72714a994021796cbe7f7047bfdf80c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c14ffa41a42faca11da06325e461702d72714a994021796cbe7f7047bfdf80c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c14ffa41a42faca11da06325e461702d72714a994021796cbe7f7047bfdf80c/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:30 compute-0 podman[214743]: 2025-12-03 21:24:30.25848218 +0000 UTC m=+0.143523479 container init ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_ellis, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:24:30 compute-0 podman[214743]: 2025-12-03 21:24:30.273629036 +0000 UTC m=+0.158670255 container start ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_ellis, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:24:30 compute-0 podman[214743]: 2025-12-03 21:24:30.277348075 +0000 UTC m=+0.162389374 container attach ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_ellis, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:24:30 compute-0 sudo[214837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqwxpahmlvrtjhxcszxekifpxnovlmrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797069.5216115-172-16157622866746/AnsiballZ_copy.py'
Dec 03 21:24:30 compute-0 sudo[214837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:30 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:24:30 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:24:30 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:24:30 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:24:30 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:24:30 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:24:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:30 compute-0 python3.9[214839]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797069.5216115-172-16157622866746/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:30 compute-0 sudo[214837]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:30 compute-0 elated_ellis[214784]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:24:30 compute-0 elated_ellis[214784]: --> All data devices are unavailable
Dec 03 21:24:30 compute-0 systemd[1]: libpod-ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323.scope: Deactivated successfully.
Dec 03 21:24:30 compute-0 podman[214743]: 2025-12-03 21:24:30.889926766 +0000 UTC m=+0.774968005 container died ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_ellis, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True)
Dec 03 21:24:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-9c14ffa41a42faca11da06325e461702d72714a994021796cbe7f7047bfdf80c-merged.mount: Deactivated successfully.
Dec 03 21:24:30 compute-0 podman[214743]: 2025-12-03 21:24:30.949502793 +0000 UTC m=+0.834544042 container remove ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_ellis, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 21:24:30 compute-0 systemd[1]: libpod-conmon-ddeb1be2d42fa07bf3a64af65a0ad876c4706c98fc15478861e537af07000323.scope: Deactivated successfully.
Dec 03 21:24:30 compute-0 sudo[214513]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:31 compute-0 sudo[214944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:24:31 compute-0 sudo[214944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:31 compute-0 sudo[214944]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:31 compute-0 sudo[214992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:24:31 compute-0 sudo[214992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:31 compute-0 sudo[215067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xklmbscjholvllwtcmwltejvvrbfqery ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797070.9081273-188-90420721399708/AnsiballZ_lineinfile.py'
Dec 03 21:24:31 compute-0 sudo[215067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:31 compute-0 podman[215084]: 2025-12-03 21:24:31.439060126 +0000 UTC m=+0.048416488 container create f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_mahavira, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:24:31 compute-0 python3.9[215069]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:31 compute-0 ceph-mon[75204]: pgmap v537: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:31 compute-0 systemd[1]: Started libpod-conmon-f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec.scope.
Dec 03 21:24:31 compute-0 sudo[215067]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:31 compute-0 podman[215084]: 2025-12-03 21:24:31.416781069 +0000 UTC m=+0.026137461 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:24:31 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:24:31 compute-0 podman[215084]: 2025-12-03 21:24:31.534260148 +0000 UTC m=+0.143616550 container init f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_mahavira, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:24:31 compute-0 podman[215084]: 2025-12-03 21:24:31.542094659 +0000 UTC m=+0.151451021 container start f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_mahavira, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:24:31 compute-0 practical_mahavira[215101]: 167 167
Dec 03 21:24:31 compute-0 podman[215084]: 2025-12-03 21:24:31.54625984 +0000 UTC m=+0.155616202 container attach f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_mahavira, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:24:31 compute-0 systemd[1]: libpod-f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec.scope: Deactivated successfully.
Dec 03 21:24:31 compute-0 podman[215084]: 2025-12-03 21:24:31.547877984 +0000 UTC m=+0.157234316 container died f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_mahavira, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:24:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-e25b0c7b7c67dbdfb92d04122d56fc41309920fc0c6ad183e69e090cf4ee6ec2-merged.mount: Deactivated successfully.
Dec 03 21:24:31 compute-0 podman[215084]: 2025-12-03 21:24:31.591922024 +0000 UTC m=+0.201278356 container remove f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_mahavira, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:24:31 compute-0 systemd[1]: libpod-conmon-f87bfdbf7e7421a701c0824b394f1527a6b9444c8e5a2ec094b28ca1f95abfec.scope: Deactivated successfully.
Dec 03 21:24:31 compute-0 podman[215172]: 2025-12-03 21:24:31.767464349 +0000 UTC m=+0.041944415 container create f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_lalande, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:24:31 compute-0 systemd[1]: Started libpod-conmon-f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd.scope.
Dec 03 21:24:31 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:24:31 compute-0 podman[215172]: 2025-12-03 21:24:31.749672803 +0000 UTC m=+0.024152889 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:24:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ffe2c4f160ba56d1b0e3e05a45f5170843345699830f772f2d14823fee7959/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ffe2c4f160ba56d1b0e3e05a45f5170843345699830f772f2d14823fee7959/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ffe2c4f160ba56d1b0e3e05a45f5170843345699830f772f2d14823fee7959/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55ffe2c4f160ba56d1b0e3e05a45f5170843345699830f772f2d14823fee7959/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:31 compute-0 podman[215172]: 2025-12-03 21:24:31.86709311 +0000 UTC m=+0.141573266 container init f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_lalande, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:24:31 compute-0 podman[215172]: 2025-12-03 21:24:31.88724187 +0000 UTC m=+0.161721946 container start f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_lalande, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:24:31 compute-0 podman[215172]: 2025-12-03 21:24:31.892321307 +0000 UTC m=+0.166801413 container attach f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_lalande, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:24:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v538: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]: {
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:     "0": [
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:         {
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "devices": [
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "/dev/loop3"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             ],
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_name": "ceph_lv0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_size": "21470642176",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "name": "ceph_lv0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "tags": {
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cluster_name": "ceph",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.crush_device_class": "",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.encrypted": "0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.objectstore": "bluestore",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osd_id": "0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.type": "block",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.vdo": "0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.with_tpm": "0"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             },
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "type": "block",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "vg_name": "ceph_vg0"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:         }
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:     ],
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:     "1": [
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:         {
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "devices": [
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "/dev/loop4"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             ],
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_name": "ceph_lv1",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_size": "21470642176",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "name": "ceph_lv1",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "tags": {
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cluster_name": "ceph",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.crush_device_class": "",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.encrypted": "0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.objectstore": "bluestore",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osd_id": "1",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.type": "block",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.vdo": "0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.with_tpm": "0"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             },
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "type": "block",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "vg_name": "ceph_vg1"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:         }
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:     ],
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:     "2": [
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:         {
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "devices": [
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "/dev/loop5"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             ],
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_name": "ceph_lv2",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_size": "21470642176",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "name": "ceph_lv2",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "tags": {
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.cluster_name": "ceph",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.crush_device_class": "",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.encrypted": "0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.objectstore": "bluestore",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osd_id": "2",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.type": "block",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.vdo": "0",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:                 "ceph.with_tpm": "0"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             },
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "type": "block",
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:             "vg_name": "ceph_vg2"
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:         }
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]:     ]
Dec 03 21:24:32 compute-0 xenodochial_lalande[215218]: }
Dec 03 21:24:32 compute-0 systemd[1]: libpod-f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd.scope: Deactivated successfully.
Dec 03 21:24:32 compute-0 podman[215172]: 2025-12-03 21:24:32.327854081 +0000 UTC m=+0.602334207 container died f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_lalande, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 03 21:24:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-55ffe2c4f160ba56d1b0e3e05a45f5170843345699830f772f2d14823fee7959-merged.mount: Deactivated successfully.
Dec 03 21:24:32 compute-0 podman[215172]: 2025-12-03 21:24:32.385894327 +0000 UTC m=+0.660374403 container remove f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=xenodochial_lalande, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:24:32 compute-0 systemd[1]: libpod-conmon-f8de85093a0c5fb583fbd89587b2cb31d158126a8e95a31448bc1ef9d24e05dd.scope: Deactivated successfully.
Dec 03 21:24:32 compute-0 sudo[214992]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:32 compute-0 sudo[215313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpwwsrqikdcjfckptfqaxgsdzlzerrid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797071.7046201-196-1385379365265/AnsiballZ_systemd.py'
Dec 03 21:24:32 compute-0 sudo[215313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:32 compute-0 sudo[215315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:24:32 compute-0 sudo[215315]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:32 compute-0 sudo[215315]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:32 compute-0 sudo[215341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:24:32 compute-0 sudo[215341]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:32 compute-0 python3.9[215321]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:24:32 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 03 21:24:32 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 03 21:24:32 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 03 21:24:32 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 03 21:24:32 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 03 21:24:32 compute-0 sudo[215313]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:32 compute-0 podman[215381]: 2025-12-03 21:24:32.922206223 +0000 UTC m=+0.054591454 container create c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_franklin, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:24:32 compute-0 systemd[1]: Started libpod-conmon-c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9.scope.
Dec 03 21:24:32 compute-0 podman[215381]: 2025-12-03 21:24:32.898247971 +0000 UTC m=+0.030633232 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:24:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:24:33 compute-0 podman[215381]: 2025-12-03 21:24:33.025239985 +0000 UTC m=+0.157625226 container init c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_franklin, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:24:33 compute-0 podman[215381]: 2025-12-03 21:24:33.039697783 +0000 UTC m=+0.172083044 container start c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_franklin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:24:33 compute-0 podman[215381]: 2025-12-03 21:24:33.044154752 +0000 UTC m=+0.176540023 container attach c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_franklin, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:24:33 compute-0 sad_franklin[215407]: 167 167
Dec 03 21:24:33 compute-0 systemd[1]: libpod-c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9.scope: Deactivated successfully.
Dec 03 21:24:33 compute-0 podman[215381]: 2025-12-03 21:24:33.050128463 +0000 UTC m=+0.182513724 container died c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_franklin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.build-date=20251030)
Dec 03 21:24:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-68a0c11c64e9b02087e15a2887daa7c0520c594e175e5ba2b54182511342c0ff-merged.mount: Deactivated successfully.
Dec 03 21:24:33 compute-0 podman[215381]: 2025-12-03 21:24:33.091956944 +0000 UTC m=+0.224342195 container remove c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_franklin, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:24:33 compute-0 systemd[1]: libpod-conmon-c190232fc00af2bc1147af138bde527fe09482963908d923b9e57469b29e93a9.scope: Deactivated successfully.
Dec 03 21:24:33 compute-0 podman[215500]: 2025-12-03 21:24:33.317535561 +0000 UTC m=+0.066732570 container create 690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_williams, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:24:33 compute-0 systemd[1]: Started libpod-conmon-690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9.scope.
Dec 03 21:24:33 compute-0 podman[215500]: 2025-12-03 21:24:33.291149483 +0000 UTC m=+0.040346582 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:24:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:24:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c24e5beff9d131a5da1e6812912f084dd81da2a5bc7b6348c5b20c87814489e8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c24e5beff9d131a5da1e6812912f084dd81da2a5bc7b6348c5b20c87814489e8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c24e5beff9d131a5da1e6812912f084dd81da2a5bc7b6348c5b20c87814489e8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c24e5beff9d131a5da1e6812912f084dd81da2a5bc7b6348c5b20c87814489e8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:24:33 compute-0 podman[215500]: 2025-12-03 21:24:33.428748982 +0000 UTC m=+0.177946101 container init 690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_williams, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:24:33 compute-0 podman[215500]: 2025-12-03 21:24:33.448145902 +0000 UTC m=+0.197342951 container start 690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_williams, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:24:33 compute-0 podman[215500]: 2025-12-03 21:24:33.452737065 +0000 UTC m=+0.201934104 container attach 690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_williams, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3)
Dec 03 21:24:33 compute-0 ceph-mon[75204]: pgmap v538: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:33 compute-0 sudo[215593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knkktdwjgktabhwikjkcivirgvhzdfaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797073.1183445-204-106977270669341/AnsiballZ_file.py'
Dec 03 21:24:33 compute-0 sudo[215593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:33 compute-0 python3.9[215595]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:24:33 compute-0 sudo[215593]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v539: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:34 compute-0 lvm[215722]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:24:34 compute-0 lvm[215722]: VG ceph_vg0 finished
Dec 03 21:24:34 compute-0 lvm[215725]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:24:34 compute-0 lvm[215725]: VG ceph_vg1 finished
Dec 03 21:24:34 compute-0 lvm[215731]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:24:34 compute-0 lvm[215731]: VG ceph_vg2 finished
Dec 03 21:24:34 compute-0 lvm[215747]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:24:34 compute-0 lvm[215747]: VG ceph_vg0 finished
Dec 03 21:24:34 compute-0 lvm[215751]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:24:34 compute-0 lvm[215751]: VG ceph_vg2 finished
Dec 03 21:24:34 compute-0 elegant_williams[215561]: {}
Dec 03 21:24:34 compute-0 lvm[215760]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:24:34 compute-0 lvm[215760]: VG ceph_vg2 finished
Dec 03 21:24:34 compute-0 systemd[1]: libpod-690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9.scope: Deactivated successfully.
Dec 03 21:24:34 compute-0 systemd[1]: libpod-690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9.scope: Consumed 1.397s CPU time.
Dec 03 21:24:34 compute-0 podman[215500]: 2025-12-03 21:24:34.302323669 +0000 UTC m=+1.051520688 container died 690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_williams, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:24:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-c24e5beff9d131a5da1e6812912f084dd81da2a5bc7b6348c5b20c87814489e8-merged.mount: Deactivated successfully.
Dec 03 21:24:34 compute-0 podman[215500]: 2025-12-03 21:24:34.355544836 +0000 UTC m=+1.104741845 container remove 690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elegant_williams, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default)
Dec 03 21:24:34 compute-0 systemd[1]: libpod-conmon-690f1e9ed32baf7c551787536e186d06d97832dc3ea908af675e59af5eabedd9.scope: Deactivated successfully.
Dec 03 21:24:34 compute-0 sudo[215341]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:24:34 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:24:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:24:34 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:24:34 compute-0 sudo[215823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:24:34 compute-0 sudo[215863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnvgyeqmppyugdnppkhcygqcdhiynoqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797074.1023598-213-111661502551439/AnsiballZ_stat.py'
Dec 03 21:24:34 compute-0 sudo[215823]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:24:34 compute-0 sudo[215863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:34 compute-0 sudo[215823]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:34 compute-0 python3.9[215866]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:24:34 compute-0 sudo[215863]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:35 compute-0 sudo[216017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpayvxhmilorlubgwdxcbqvfkibijjvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797074.919094-222-211986605335789/AnsiballZ_stat.py'
Dec 03 21:24:35 compute-0 sudo[216017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:35 compute-0 ceph-mon[75204]: pgmap v539: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:24:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:24:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:35 compute-0 python3.9[216019]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:24:35 compute-0 sudo[216017]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v540: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:36 compute-0 sudo[216169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuivttmxibeujtgpeepkylvtoggggpou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797075.7428403-230-184252749930175/AnsiballZ_stat.py'
Dec 03 21:24:36 compute-0 sudo[216169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:36 compute-0 python3.9[216171]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:36 compute-0 sudo[216169]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:36 compute-0 sudo[216292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytpyguqnnuygcozlsfwpemzwqnrnhqcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797075.7428403-230-184252749930175/AnsiballZ_copy.py'
Dec 03 21:24:36 compute-0 sudo[216292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:36 compute-0 python3.9[216294]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797075.7428403-230-184252749930175/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:36 compute-0 sudo[216292]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:37 compute-0 ceph-mon[75204]: pgmap v540: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:37 compute-0 sudo[216444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvfvgooblgblvhjwpzwmbxykgytplnlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797077.1243253-245-42229955930323/AnsiballZ_command.py'
Dec 03 21:24:37 compute-0 sudo[216444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:37 compute-0 python3.9[216446]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:24:37 compute-0 sudo[216444]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:37 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v541: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:38 compute-0 sudo[216597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omjqsbszbbwfotrzzhvidjbgmutfbuoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797077.9538448-253-173040954536873/AnsiballZ_lineinfile.py'
Dec 03 21:24:38 compute-0 sudo[216597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:38 compute-0 python3.9[216599]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:38 compute-0 sudo[216597]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:39 compute-0 ceph-mon[75204]: pgmap v541: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:39 compute-0 sudo[216749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocyybaletcivpzigftyhytwewgijzhkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797078.8372655-261-173415430148304/AnsiballZ_replace.py'
Dec 03 21:24:39 compute-0 sudo[216749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:39 compute-0 python3.9[216751]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:39 compute-0 sudo[216749]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:39 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v542: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:40 compute-0 sudo[216901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjqtmjdfwnlcizqekhoegezsosdscoej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797079.9636264-269-277392789387810/AnsiballZ_replace.py'
Dec 03 21:24:40 compute-0 sudo[216901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:40 compute-0 python3.9[216903]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:40 compute-0 sudo[216901]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:41 compute-0 sudo[217053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovqufgjrqraqklrxriogzqlssngvilkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797080.8675044-278-88045225122978/AnsiballZ_lineinfile.py'
Dec 03 21:24:41 compute-0 sudo[217053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:41 compute-0 python3.9[217055]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:41 compute-0 ceph-mon[75204]: pgmap v542: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:41 compute-0 sudo[217053]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:41 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v543: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:42 compute-0 sudo[217205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezwewbbtfleftzjybrphfecqupsaikin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797081.6761339-278-65104053937493/AnsiballZ_lineinfile.py'
Dec 03 21:24:42 compute-0 sudo[217205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:42 compute-0 python3.9[217207]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:42 compute-0 sudo[217205]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:42 compute-0 sudo[217367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkmmqvopfeqqmetaqbqvgypwymyuqgtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797082.4658177-278-67439306536518/AnsiballZ_lineinfile.py'
Dec 03 21:24:42 compute-0 sudo[217367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:42 compute-0 podman[217331]: 2025-12-03 21:24:42.933241219 +0000 UTC m=+0.159348162 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 03 21:24:43 compute-0 python3.9[217373]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:43 compute-0 sudo[217367]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:43 compute-0 ceph-mon[75204]: pgmap v543: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:43 compute-0 sudo[217533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubxqggyaxjhozcqbicvdlslrwjjybhxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797083.2608736-278-88481817224172/AnsiballZ_lineinfile.py'
Dec 03 21:24:43 compute-0 sudo[217533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:43 compute-0 python3.9[217535]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:43 compute-0 sudo[217533]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:43 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v544: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:44 compute-0 sudo[217685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyxquiwhliuxynidqemtzuermcuqgfhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797083.990227-307-187649311717794/AnsiballZ_stat.py'
Dec 03 21:24:44 compute-0 sudo[217685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:44 compute-0 python3.9[217687]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:24:44 compute-0 sudo[217685]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:45 compute-0 sudo[217839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbfodhmplocwsqdniohegujgwwduvitw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797084.7401335-315-134910889853086/AnsiballZ_file.py'
Dec 03 21:24:45 compute-0 sudo[217839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:45 compute-0 python3.9[217841]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:45 compute-0 sudo[217839]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:45 compute-0 ceph-mon[75204]: pgmap v544: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:45 compute-0 sudo[217991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myqlwsezpsgmqttxqqmiiwggfenitzlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797085.5550735-324-81748500671378/AnsiballZ_file.py'
Dec 03 21:24:45 compute-0 sudo[217991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:45 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v545: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:46 compute-0 python3.9[217993]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:24:46 compute-0 sudo[217991]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:46 compute-0 sudo[218143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djnnwqffazvkwpigoydbztsiqiujrpsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797086.4246101-332-91498257502019/AnsiballZ_stat.py'
Dec 03 21:24:46 compute-0 sudo[218143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:46 compute-0 python3.9[218145]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:47 compute-0 sudo[218143]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:47 compute-0 sudo[218221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnkxnwgiwfeopecbjzwmcfioapexfweg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797086.4246101-332-91498257502019/AnsiballZ_file.py'
Dec 03 21:24:47 compute-0 sudo[218221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:47 compute-0 python3.9[218223]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:24:47 compute-0 sudo[218221]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:47 compute-0 ceph-mon[75204]: pgmap v545: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:47 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v546: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:48 compute-0 sudo[218373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtpnoproouimcqoxvlhuizovgkiatyli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797087.654568-332-216521334469791/AnsiballZ_stat.py'
Dec 03 21:24:48 compute-0 sudo[218373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:48 compute-0 python3.9[218375]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:48 compute-0 sudo[218373]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:48 compute-0 sudo[218451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksodxikwgzlqxuonxepirnystxthbarp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797087.654568-332-216521334469791/AnsiballZ_file.py'
Dec 03 21:24:48 compute-0 sudo[218451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:48 compute-0 python3.9[218453]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:24:48 compute-0 sudo[218451]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:24:48.927 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:24:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:24:48.927 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:24:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:24:48.928 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:24:49 compute-0 sudo[218603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwkdalzmpddqhllvodscjfttnjanpait ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797089.0004225-355-227762065317182/AnsiballZ_file.py'
Dec 03 21:24:49 compute-0 sudo[218603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:49 compute-0 python3.9[218605]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:49 compute-0 sudo[218603]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:49 compute-0 ceph-mon[75204]: pgmap v546: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:49 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v547: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:50 compute-0 sudo[218755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmalmfbyqdzkfrztonxwtqwlbsccsiia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797089.6335804-363-265503545386903/AnsiballZ_stat.py'
Dec 03 21:24:50 compute-0 sudo[218755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:50 compute-0 python3.9[218757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:50 compute-0 sudo[218755]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:50 compute-0 sudo[218833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwnorjczoqehbbcdlvnskdhdyqbvtkbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797089.6335804-363-265503545386903/AnsiballZ_file.py'
Dec 03 21:24:50 compute-0 sudo[218833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:50 compute-0 python3.9[218835]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:50 compute-0 sudo[218833]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:51 compute-0 sudo[218985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffeurjolarexpecnrphyxkpohbqxsdrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797090.937972-375-176457773076326/AnsiballZ_stat.py'
Dec 03 21:24:51 compute-0 sudo[218985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:51 compute-0 python3.9[218987]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:51 compute-0 sudo[218985]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:51 compute-0 ceph-mon[75204]: pgmap v547: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:24:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:24:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:24:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:24:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:24:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:24:51 compute-0 sudo[219063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdyuqwnygxpwfsukfcgeberggksnbref ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797090.937972-375-176457773076326/AnsiballZ_file.py'
Dec 03 21:24:51 compute-0 sudo[219063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:51 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v548: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:52 compute-0 python3.9[219065]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:52 compute-0 sudo[219063]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:52 compute-0 sudo[219215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-turcjqahboqbmvfgofkvtzydybgynzgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797092.2445858-387-104490788597970/AnsiballZ_systemd.py'
Dec 03 21:24:52 compute-0 sudo[219215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:52 compute-0 python3.9[219217]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:24:53 compute-0 systemd[1]: Reloading.
Dec 03 21:24:53 compute-0 systemd-rc-local-generator[219238]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:24:53 compute-0 systemd-sysv-generator[219243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:24:53 compute-0 sudo[219215]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:53 compute-0 ceph-mon[75204]: pgmap v548: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:53 compute-0 sudo[219404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkzvvkigqoenxwwaptzofvjjbhprxhpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797093.6545599-395-22305384205359/AnsiballZ_stat.py'
Dec 03 21:24:53 compute-0 sudo[219404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:53 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v549: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:54 compute-0 python3.9[219406]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:54 compute-0 sudo[219404]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:54 compute-0 sudo[219482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dihhixgwaehmclhwncwgcktyduwaudkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797093.6545599-395-22305384205359/AnsiballZ_file.py'
Dec 03 21:24:54 compute-0 sudo[219482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:54 compute-0 podman[219484]: 2025-12-03 21:24:54.579267802 +0000 UTC m=+0.063071971 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 03 21:24:54 compute-0 python3.9[219485]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:54 compute-0 sudo[219482]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:55 compute-0 sudo[219653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klapxcthiffpehslzenctohxccpaevhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797094.9007933-407-174725628645111/AnsiballZ_stat.py'
Dec 03 21:24:55 compute-0 sudo[219653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:55 compute-0 python3.9[219655]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:55 compute-0 sudo[219653]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:24:55 compute-0 ceph-mon[75204]: pgmap v549: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:55 compute-0 sudo[219731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvqpxfxwylizsjexbajlgmcoqjnewniz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797094.9007933-407-174725628645111/AnsiballZ_file.py'
Dec 03 21:24:55 compute-0 sudo[219731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:55 compute-0 python3.9[219733]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:24:55 compute-0 sudo[219731]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:55 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v550: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:56 compute-0 sudo[219883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwldcwaiopigcoermqdnbnkfpciefouy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797096.0514023-419-260937059678956/AnsiballZ_systemd.py'
Dec 03 21:24:56 compute-0 sudo[219883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:56 compute-0 python3.9[219885]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:24:56 compute-0 systemd[1]: Reloading.
Dec 03 21:24:56 compute-0 systemd-rc-local-generator[219914]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:24:56 compute-0 systemd-sysv-generator[219917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:24:57 compute-0 systemd[1]: Starting Create netns directory...
Dec 03 21:24:57 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 03 21:24:57 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 03 21:24:57 compute-0 systemd[1]: Finished Create netns directory.
Dec 03 21:24:57 compute-0 sudo[219883]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:57 compute-0 ceph-mon[75204]: pgmap v550: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:57 compute-0 sudo[220076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvrbtcunyubuagiqfbxzhwageyxbdoyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797097.5373516-429-19556027316740/AnsiballZ_file.py'
Dec 03 21:24:57 compute-0 sudo[220076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:57 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v551: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:58 compute-0 python3.9[220078]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:24:58 compute-0 sudo[220076]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:58 compute-0 sudo[220228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yserowrxkefbxwpnmlsluoweovucxbka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797098.2170405-437-85516893330854/AnsiballZ_stat.py'
Dec 03 21:24:58 compute-0 sudo[220228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:58 compute-0 python3.9[220230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:24:58 compute-0 sudo[220228]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:59 compute-0 sudo[220351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efrsgjuokwoaamdiheieunulggnfhijf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797098.2170405-437-85516893330854/AnsiballZ_copy.py'
Dec 03 21:24:59 compute-0 sudo[220351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:24:59 compute-0 python3.9[220353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797098.2170405-437-85516893330854/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:24:59 compute-0 sudo[220351]: pam_unix(sudo:session): session closed for user root
Dec 03 21:24:59 compute-0 ceph-mon[75204]: pgmap v551: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:24:59 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v552: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:00 compute-0 sudo[220503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqxqfalgxxovyfeayoumzreqwpyyhmpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797099.8689475-454-50649366610183/AnsiballZ_file.py'
Dec 03 21:25:00 compute-0 sudo[220503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:00 compute-0 python3.9[220505]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:25:00 compute-0 sudo[220503]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:00 compute-0 sudo[220655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwrhfvlargiuffidkzjcbfswuefvawpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797100.623288-462-128579134738981/AnsiballZ_stat.py'
Dec 03 21:25:00 compute-0 sudo[220655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:01 compute-0 python3.9[220657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:25:01 compute-0 sudo[220655]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:01 compute-0 sudo[220778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmgcnhwuxcndlywxdeqepttdupwisano ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797100.623288-462-128579134738981/AnsiballZ_copy.py'
Dec 03 21:25:01 compute-0 sudo[220778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:01 compute-0 ceph-mon[75204]: pgmap v552: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:01 compute-0 python3.9[220780]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797100.623288-462-128579134738981/.source.json _original_basename=.n3arkdfy follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:01 compute-0 sudo[220778]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:01 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v553: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:02 compute-0 sudo[220930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvywftyzbgvjcuxjcmxzsrnaaasfltkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797102.0592117-477-136243500269796/AnsiballZ_file.py'
Dec 03 21:25:02 compute-0 sudo[220930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:02 compute-0 python3.9[220932]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:02 compute-0 sudo[220930]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:03 compute-0 sudo[221082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwncavjgmtkbtgkcakyhjwmlfugpwwxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797102.8799129-485-102303827205253/AnsiballZ_stat.py'
Dec 03 21:25:03 compute-0 sudo[221082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:03 compute-0 sudo[221082]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:03 compute-0 ceph-mon[75204]: pgmap v553: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:03 compute-0 sudo[221205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlxfjxfcgvkgjkxgltcxjgjihocmnqof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797102.8799129-485-102303827205253/AnsiballZ_copy.py'
Dec 03 21:25:03 compute-0 sudo[221205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:03 compute-0 sudo[221205]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:03 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v554: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:04 compute-0 sudo[221357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awusnnmfewzplbinqewaiptjagrwwhyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797104.2241986-502-130231994381241/AnsiballZ_container_config_data.py'
Dec 03 21:25:04 compute-0 sudo[221357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:04 compute-0 python3.9[221359]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 03 21:25:04 compute-0 sudo[221357]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:05 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 03 21:25:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:05 compute-0 ceph-mon[75204]: pgmap v554: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:05 compute-0 sudo[221510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzryysdsfjbkhvmmnfigllxqqiharoce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797105.201703-511-267301082141549/AnsiballZ_container_config_hash.py'
Dec 03 21:25:05 compute-0 sudo[221510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:05 compute-0 python3.9[221512]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 03 21:25:05 compute-0 sudo[221510]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:05 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v555: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:06 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 03 21:25:06 compute-0 sudo[221663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-runeveijlxvivuqdinvbqjinuzqmkljv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797106.2015586-520-176416767039950/AnsiballZ_podman_container_info.py'
Dec 03 21:25:06 compute-0 sudo[221663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:06 compute-0 python3.9[221665]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 03 21:25:07 compute-0 sudo[221663]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:07 compute-0 ceph-mon[75204]: pgmap v555: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:07 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v556: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:08 compute-0 sudo[221842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-resryncosjdjovxooxmdrgdlcseowqxg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764797107.799629-533-27972127011158/AnsiballZ_edpm_container_manage.py'
Dec 03 21:25:08 compute-0 sudo[221842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:08 compute-0 python3[221844]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 03 21:25:09 compute-0 podman[221855]: 2025-12-03 21:25:09.695447327 +0000 UTC m=+1.069111689 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 03 21:25:09 compute-0 ceph-mon[75204]: pgmap v556: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:09 compute-0 podman[221914]: 2025-12-03 21:25:09.860747348 +0000 UTC m=+0.070777018 container create 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:25:09 compute-0 podman[221914]: 2025-12-03 21:25:09.825092022 +0000 UTC m=+0.035121742 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 03 21:25:09 compute-0 python3[221844]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 03 21:25:09 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v557: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:10 compute-0 sudo[221842]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:10 compute-0 sudo[222100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrprkshsxltozoldhtpkqelrrdukkptr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797110.3054051-541-218158426711208/AnsiballZ_stat.py'
Dec 03 21:25:10 compute-0 sudo[222100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:10 compute-0 python3.9[222102]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:25:10 compute-0 sudo[222100]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:11 compute-0 sudo[222254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsnglqxrbxnohavzrbziplzjasfhydgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797111.2465158-550-267494174914332/AnsiballZ_file.py'
Dec 03 21:25:11 compute-0 sudo[222254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:11 compute-0 ceph-mon[75204]: pgmap v557: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:11 compute-0 python3.9[222256]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:11 compute-0 sudo[222254]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:11 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v558: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:12 compute-0 sudo[222330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlgbxmgwhfcyupkytbensjnvijsjlyau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797111.2465158-550-267494174914332/AnsiballZ_stat.py'
Dec 03 21:25:12 compute-0 sudo[222330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:12 compute-0 python3.9[222332]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:25:12 compute-0 sudo[222330]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:12 compute-0 sudo[222481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpcyzzushlkvvzkjpjassindbshgdmvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797112.4214425-550-92593012327089/AnsiballZ_copy.py'
Dec 03 21:25:12 compute-0 sudo[222481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:13 compute-0 python3.9[222483]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764797112.4214425-550-92593012327089/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:13 compute-0 sudo[222481]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:13 compute-0 podman[222484]: 2025-12-03 21:25:13.235392629 +0000 UTC m=+0.164311665 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:25:13 compute-0 sudo[222583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvyetdfotrskxgksiephwkhhxeiwmfxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797112.4214425-550-92593012327089/AnsiballZ_systemd.py'
Dec 03 21:25:13 compute-0 sudo[222583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:13 compute-0 ceph-mon[75204]: pgmap v558: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:13 compute-0 python3.9[222585]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:25:13 compute-0 systemd[1]: Reloading.
Dec 03 21:25:13 compute-0 systemd-rc-local-generator[222610]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:25:13 compute-0 systemd-sysv-generator[222615]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:25:13 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v559: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:14 compute-0 sudo[222583]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:14 compute-0 sudo[222693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seuteeprrevlzvxchgjtdxvxktsuaaxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797112.4214425-550-92593012327089/AnsiballZ_systemd.py'
Dec 03 21:25:14 compute-0 sudo[222693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:14 compute-0 python3.9[222695]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:14 compute-0 systemd[1]: Reloading.
Dec 03 21:25:15 compute-0 systemd-rc-local-generator[222719]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:25:15 compute-0 systemd-sysv-generator[222726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:25:15 compute-0 systemd[1]: Starting multipathd container...
Dec 03 21:25:15 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9490031f965e3c8c8ec7bffa43f620db69dfaa1dd92d685b2e794e3ebc30139/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9490031f965e3c8c8ec7bffa43f620db69dfaa1dd92d685b2e794e3ebc30139/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c.
Dec 03 21:25:15 compute-0 podman[222735]: 2025-12-03 21:25:15.618697337 +0000 UTC m=+0.282709220 container init 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec 03 21:25:15 compute-0 multipathd[222750]: + sudo -E kolla_set_configs
Dec 03 21:25:15 compute-0 sudo[222756]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 03 21:25:15 compute-0 sudo[222756]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 03 21:25:15 compute-0 podman[222735]: 2025-12-03 21:25:15.669731594 +0000 UTC m=+0.333743477 container start 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 03 21:25:15 compute-0 sudo[222756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 03 21:25:15 compute-0 podman[222735]: multipathd
Dec 03 21:25:15 compute-0 systemd[1]: Started multipathd container.
Dec 03 21:25:15 compute-0 sudo[222693]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:15 compute-0 multipathd[222750]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 03 21:25:15 compute-0 multipathd[222750]: INFO:__main__:Validating config file
Dec 03 21:25:15 compute-0 multipathd[222750]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 03 21:25:15 compute-0 multipathd[222750]: INFO:__main__:Writing out command to execute
Dec 03 21:25:15 compute-0 sudo[222756]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:15 compute-0 multipathd[222750]: ++ cat /run_command
Dec 03 21:25:15 compute-0 multipathd[222750]: + CMD='/usr/sbin/multipathd -d'
Dec 03 21:25:15 compute-0 multipathd[222750]: + ARGS=
Dec 03 21:25:15 compute-0 multipathd[222750]: + sudo kolla_copy_cacerts
Dec 03 21:25:15 compute-0 ceph-mon[75204]: pgmap v559: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:15 compute-0 sudo[222776]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 03 21:25:15 compute-0 sudo[222776]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 03 21:25:15 compute-0 sudo[222776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 03 21:25:15 compute-0 sudo[222776]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:15 compute-0 podman[222757]: 2025-12-03 21:25:15.789030812 +0000 UTC m=+0.102229311 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 03 21:25:15 compute-0 multipathd[222750]: + [[ ! -n '' ]]
Dec 03 21:25:15 compute-0 multipathd[222750]: + . kolla_extend_start
Dec 03 21:25:15 compute-0 multipathd[222750]: Running command: '/usr/sbin/multipathd -d'
Dec 03 21:25:15 compute-0 multipathd[222750]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 03 21:25:15 compute-0 multipathd[222750]: + umask 0022
Dec 03 21:25:15 compute-0 multipathd[222750]: + exec /usr/sbin/multipathd -d
Dec 03 21:25:15 compute-0 systemd[1]: 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c-2060e1ae219baa4e.service: Main process exited, code=exited, status=1/FAILURE
Dec 03 21:25:15 compute-0 systemd[1]: 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c-2060e1ae219baa4e.service: Failed with result 'exit-code'.
Dec 03 21:25:15 compute-0 multipathd[222750]: 3055.437631 | --------start up--------
Dec 03 21:25:15 compute-0 multipathd[222750]: 3055.437649 | read /etc/multipath.conf
Dec 03 21:25:15 compute-0 multipathd[222750]: 3055.446042 | path checkers start up
Dec 03 21:25:15 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v560: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:16 compute-0 python3.9[222938]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:25:17 compute-0 sudo[223090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbzborefnjwoedifhjbxfidhjwzqrxbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797116.7676373-586-108152520481610/AnsiballZ_command.py'
Dec 03 21:25:17 compute-0 sudo[223090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:17 compute-0 python3.9[223092]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:25:17 compute-0 sudo[223090]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:17 compute-0 ceph-mon[75204]: pgmap v560: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:17 compute-0 sudo[223255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngngfsynanxisrmezqfvpnbvuolhrvai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797117.6169379-594-131496850838384/AnsiballZ_systemd.py'
Dec 03 21:25:17 compute-0 sudo[223255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:17 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v561: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:18 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 03 21:25:18 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 03 21:25:18 compute-0 python3.9[223257]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:25:18 compute-0 systemd[1]: Stopping multipathd container...
Dec 03 21:25:18 compute-0 multipathd[222750]: 3057.990949 | exit (signal)
Dec 03 21:25:18 compute-0 multipathd[222750]: 3057.991058 | --------shut down-------
Dec 03 21:25:18 compute-0 systemd[1]: libpod-2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c.scope: Deactivated successfully.
Dec 03 21:25:18 compute-0 podman[223263]: 2025-12-03 21:25:18.391317479 +0000 UTC m=+0.077442937 container died 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 21:25:18 compute-0 systemd[1]: 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c-2060e1ae219baa4e.timer: Deactivated successfully.
Dec 03 21:25:18 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c.
Dec 03 21:25:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c-userdata-shm.mount: Deactivated successfully.
Dec 03 21:25:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-e9490031f965e3c8c8ec7bffa43f620db69dfaa1dd92d685b2e794e3ebc30139-merged.mount: Deactivated successfully.
Dec 03 21:25:18 compute-0 podman[223263]: 2025-12-03 21:25:18.650152458 +0000 UTC m=+0.336277876 container cleanup 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:25:18 compute-0 podman[223263]: multipathd
Dec 03 21:25:18 compute-0 podman[223293]: multipathd
Dec 03 21:25:18 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 03 21:25:18 compute-0 systemd[1]: Stopped multipathd container.
Dec 03 21:25:18 compute-0 systemd[1]: Starting multipathd container...
Dec 03 21:25:18 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9490031f965e3c8c8ec7bffa43f620db69dfaa1dd92d685b2e794e3ebc30139/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9490031f965e3c8c8ec7bffa43f620db69dfaa1dd92d685b2e794e3ebc30139/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:18 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c.
Dec 03 21:25:18 compute-0 podman[223306]: 2025-12-03 21:25:18.876190836 +0000 UTC m=+0.133292933 container init 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 03 21:25:18 compute-0 multipathd[223321]: + sudo -E kolla_set_configs
Dec 03 21:25:18 compute-0 podman[223306]: 2025-12-03 21:25:18.91324618 +0000 UTC m=+0.170348277 container start 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 03 21:25:18 compute-0 podman[223306]: multipathd
Dec 03 21:25:18 compute-0 sudo[223327]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 03 21:25:18 compute-0 sudo[223327]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 03 21:25:18 compute-0 sudo[223327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 03 21:25:18 compute-0 systemd[1]: Started multipathd container.
Dec 03 21:25:18 compute-0 sudo[223255]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:18 compute-0 multipathd[223321]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 03 21:25:18 compute-0 multipathd[223321]: INFO:__main__:Validating config file
Dec 03 21:25:18 compute-0 multipathd[223321]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 03 21:25:18 compute-0 multipathd[223321]: INFO:__main__:Writing out command to execute
Dec 03 21:25:19 compute-0 podman[223328]: 2025-12-03 21:25:19.007447925 +0000 UTC m=+0.082357528 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Dec 03 21:25:19 compute-0 sudo[223327]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:19 compute-0 systemd[1]: 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c-31dea9158dd74dbc.service: Main process exited, code=exited, status=1/FAILURE
Dec 03 21:25:19 compute-0 systemd[1]: 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c-31dea9158dd74dbc.service: Failed with result 'exit-code'.
Dec 03 21:25:19 compute-0 multipathd[223321]: ++ cat /run_command
Dec 03 21:25:19 compute-0 multipathd[223321]: + CMD='/usr/sbin/multipathd -d'
Dec 03 21:25:19 compute-0 multipathd[223321]: + ARGS=
Dec 03 21:25:19 compute-0 multipathd[223321]: + sudo kolla_copy_cacerts
Dec 03 21:25:19 compute-0 sudo[223356]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 03 21:25:19 compute-0 sudo[223356]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 03 21:25:19 compute-0 sudo[223356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 03 21:25:19 compute-0 sudo[223356]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:19 compute-0 multipathd[223321]: + [[ ! -n '' ]]
Dec 03 21:25:19 compute-0 multipathd[223321]: + . kolla_extend_start
Dec 03 21:25:19 compute-0 multipathd[223321]: Running command: '/usr/sbin/multipathd -d'
Dec 03 21:25:19 compute-0 multipathd[223321]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 03 21:25:19 compute-0 multipathd[223321]: + umask 0022
Dec 03 21:25:19 compute-0 multipathd[223321]: + exec /usr/sbin/multipathd -d
Dec 03 21:25:19 compute-0 multipathd[223321]: 3058.691925 | --------start up--------
Dec 03 21:25:19 compute-0 multipathd[223321]: 3058.691953 | read /etc/multipath.conf
Dec 03 21:25:19 compute-0 multipathd[223321]: 3058.698324 | path checkers start up
Dec 03 21:25:19 compute-0 sudo[223508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjaxdlhdfcakfuwpkodnkegjmrnixzvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797119.2103863-602-241495770888034/AnsiballZ_file.py'
Dec 03 21:25:19 compute-0 sudo[223508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:19 compute-0 ceph-mon[75204]: pgmap v561: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:19 compute-0 python3.9[223510]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:19 compute-0 sudo[223508]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:19 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v562: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:20 compute-0 sudo[223660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltkeizxtzwosklydoklgtxmeuhyqhigl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797120.3461914-614-46570912402731/AnsiballZ_file.py'
Dec 03 21:25:20 compute-0 sudo[223660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:20 compute-0 python3.9[223662]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 03 21:25:21 compute-0 sudo[223660]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:25:21
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['.mgr', 'vms', 'images', 'backups', 'volumes', 'cephfs.cephfs.data', 'cephfs.cephfs.meta']
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:25:21 compute-0 sudo[223812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzpuvxgyyzbprkfmszwqoryxeolktpoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797121.1945424-622-57809065480405/AnsiballZ_modprobe.py'
Dec 03 21:25:21 compute-0 sudo[223812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:21 compute-0 python3.9[223814]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:25:21 compute-0 kernel: Key type psk registered
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:25:21 compute-0 ceph-mon[75204]: pgmap v562: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:21 compute-0 sudo[223812]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:25:21 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v563: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:22 compute-0 sudo[223974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrovplbofzdxvulzxddagwrxbtzkukut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797122.0979817-630-12011584822208/AnsiballZ_stat.py'
Dec 03 21:25:22 compute-0 sudo[223974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:22 compute-0 python3.9[223976]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:25:22 compute-0 sudo[223974]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:23 compute-0 sudo[224097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggdjkqcnffntmyeuadoesnbhbfhcvenr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797122.0979817-630-12011584822208/AnsiballZ_copy.py'
Dec 03 21:25:23 compute-0 sudo[224097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:23 compute-0 python3.9[224099]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764797122.0979817-630-12011584822208/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:23 compute-0 sudo[224097]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:23 compute-0 ceph-mon[75204]: pgmap v563: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:23 compute-0 sudo[224249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqkxhoiysjgbyxbeswempcdyivmqkcbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797123.667489-646-89385656558735/AnsiballZ_lineinfile.py'
Dec 03 21:25:23 compute-0 sudo[224249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:23 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v564: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:24 compute-0 python3.9[224251]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:24 compute-0 sudo[224249]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:24 compute-0 sudo[224411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohitfsbvykuqymfggigqtyfsxzlaippu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797124.44157-654-216577778181638/AnsiballZ_systemd.py'
Dec 03 21:25:24 compute-0 sudo[224411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:24 compute-0 podman[224375]: 2025-12-03 21:25:24.902701204 +0000 UTC m=+0.087874227 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 03 21:25:25 compute-0 python3.9[224418]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:25:25 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 03 21:25:25 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 03 21:25:25 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 03 21:25:25 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 03 21:25:25 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 03 21:25:25 compute-0 sudo[224411]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:25 compute-0 ceph-mon[75204]: pgmap v564: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:25 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v565: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:26 compute-0 sudo[224576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwfkewslflwhrgpbcsmffbqyubmkjdgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797125.6687574-662-178931958117696/AnsiballZ_dnf.py'
Dec 03 21:25:26 compute-0 sudo[224576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:26 compute-0 python3.9[224578]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:25:27 compute-0 ceph-mon[75204]: pgmap v565: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:27 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v566: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:28 compute-0 systemd[1]: Reloading.
Dec 03 21:25:28 compute-0 systemd-sysv-generator[224613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:25:28 compute-0 systemd-rc-local-generator[224607]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:25:28 compute-0 systemd[1]: Reloading.
Dec 03 21:25:28 compute-0 systemd-rc-local-generator[224644]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:25:28 compute-0 systemd-sysv-generator[224647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:25:29 compute-0 systemd-logind[787]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 03 21:25:29 compute-0 systemd-logind[787]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 03 21:25:29 compute-0 lvm[224693]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:25:29 compute-0 lvm[224691]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:25:29 compute-0 lvm[224691]: VG ceph_vg1 finished
Dec 03 21:25:29 compute-0 lvm[224693]: VG ceph_vg2 finished
Dec 03 21:25:29 compute-0 lvm[224692]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:25:29 compute-0 lvm[224692]: VG ceph_vg0 finished
Dec 03 21:25:29 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 03 21:25:29 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 03 21:25:29 compute-0 systemd[1]: Reloading.
Dec 03 21:25:29 compute-0 systemd-sysv-generator[224750]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:25:29 compute-0 systemd-rc-local-generator[224747]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:25:29 compute-0 ceph-mon[75204]: pgmap v566: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:29 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 03 21:25:29 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v567: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:30 compute-0 sudo[224576]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:30 compute-0 sudo[225911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyxkwfapbxryezcehhmjgzwvawmvbxyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797130.4841547-670-110325691329622/AnsiballZ_systemd_service.py'
Dec 03 21:25:30 compute-0 sudo[225911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 03 21:25:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 03 21:25:31 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.912s CPU time.
Dec 03 21:25:31 compute-0 python3.9[225940]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:25:31 compute-0 systemd[1]: run-rba4d539b72704003985510d4409248f5.service: Deactivated successfully.
Dec 03 21:25:31 compute-0 systemd[1]: Stopping Open-iSCSI...
Dec 03 21:25:31 compute-0 iscsid[213732]: iscsid shutting down.
Dec 03 21:25:31 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Dec 03 21:25:31 compute-0 systemd[1]: Stopped Open-iSCSI.
Dec 03 21:25:31 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 03 21:25:31 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 03 21:25:31 compute-0 systemd[1]: Started Open-iSCSI.
Dec 03 21:25:31 compute-0 sudo[225911]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:31 compute-0 ceph-mon[75204]: pgmap v567: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:31 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v568: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:32 compute-0 python3.9[226191]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 03 21:25:32 compute-0 sudo[226345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnrqewwxqgomqywjlutdeflmuoekolkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797132.675446-688-268406364361957/AnsiballZ_file.py'
Dec 03 21:25:32 compute-0 sudo[226345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:33 compute-0 python3.9[226347]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:33 compute-0 sudo[226345]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:33 compute-0 ceph-mon[75204]: pgmap v568: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:33 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v569: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:34 compute-0 sudo[226497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfwsapqtrungowtpwhxkfwkyqteedusk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797133.6862257-699-212854164480381/AnsiballZ_systemd_service.py'
Dec 03 21:25:34 compute-0 sudo[226497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:34 compute-0 python3.9[226499]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:25:34 compute-0 systemd[1]: Reloading.
Dec 03 21:25:34 compute-0 systemd-rc-local-generator[226527]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:25:34 compute-0 systemd-sysv-generator[226531]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:25:34 compute-0 sudo[226534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:25:34 compute-0 sudo[226534]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:34 compute-0 sudo[226534]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:34 compute-0 sudo[226497]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:34 compute-0 sudo[226561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:25:34 compute-0 sudo[226561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:35 compute-0 sudo[226561]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:25:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:25:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:25:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:25:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:25:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:25:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:25:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:25:35 compute-0 sudo[226765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:25:35 compute-0 sudo[226765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:35 compute-0 sudo[226765]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:35 compute-0 python3.9[226757]: ansible-ansible.builtin.service_facts Invoked
Dec 03 21:25:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #30. Immutable memtables: 0.
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.503802) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 30
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797135503820, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1007, "num_deletes": 251, "total_data_size": 1003566, "memory_usage": 1022512, "flush_reason": "Manual Compaction"}
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #31: started
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797135509548, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 31, "file_size": 607958, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11096, "largest_seqno": 12102, "table_properties": {"data_size": 604076, "index_size": 1534, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9761, "raw_average_key_size": 19, "raw_value_size": 595817, "raw_average_value_size": 1215, "num_data_blocks": 70, "num_entries": 490, "num_filter_entries": 490, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797034, "oldest_key_time": 1764797034, "file_creation_time": 1764797135, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 31, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 5795 microseconds, and 2138 cpu microseconds.
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.509596) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #31: 607958 bytes OK
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.509608) [db/memtable_list.cc:519] [default] Level-0 commit table #31 started
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.510598) [db/memtable_list.cc:722] [default] Level-0 commit table #31: memtable #1 done
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.510611) EVENT_LOG_v1 {"time_micros": 1764797135510608, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.510624) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 998834, prev total WAL file size 998834, number of live WAL files 2.
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000027.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.510996) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323531' seq:72057594037927935, type:22 .. '6D67727374617400353033' seq:0, type:0; will stop at (end)
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [31(593KB)], [29(5876KB)]
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797135511054, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [31], "files_L6": [29], "score": -1, "input_data_size": 6625368, "oldest_snapshot_seqno": -1}
Dec 03 21:25:35 compute-0 sudo[226790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:25:35 compute-0 sudo[226790]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:35 compute-0 network[226831]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 03 21:25:35 compute-0 network[226832]: 'network-scripts' will be removed from distribution in near future.
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #32: 3229 keys, 4889473 bytes, temperature: kUnknown
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797135545854, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 32, "file_size": 4889473, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4866650, "index_size": 13626, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8133, "raw_key_size": 74733, "raw_average_key_size": 23, "raw_value_size": 4807579, "raw_average_value_size": 1488, "num_data_blocks": 606, "num_entries": 3229, "num_filter_entries": 3229, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797135, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:25:35 compute-0 network[226833]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.546207) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 4889473 bytes
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.547490) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 189.3 rd, 139.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 5.7 +0.0 blob) out(4.7 +0.0 blob), read-write-amplify(18.9) write-amplify(8.0) OK, records in: 3698, records dropped: 469 output_compression: NoCompression
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.547509) EVENT_LOG_v1 {"time_micros": 1764797135547500, "job": 12, "event": "compaction_finished", "compaction_time_micros": 35007, "compaction_time_cpu_micros": 11475, "output_level": 6, "num_output_files": 1, "total_output_size": 4889473, "num_input_records": 3698, "num_output_records": 3229, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000031.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797135547768, "job": 12, "event": "table_file_deletion", "file_number": 31}
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797135548878, "job": 12, "event": "table_file_deletion", "file_number": 29}
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.510929) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.548928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.548932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.548934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.548935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:25:35 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:25:35.548937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:25:35 compute-0 podman[226851]: 2025-12-03 21:25:35.783548205 +0000 UTC m=+0.038132184 container create 17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_lamport, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:25:35 compute-0 podman[226851]: 2025-12-03 21:25:35.767173746 +0000 UTC m=+0.021757745 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:25:35 compute-0 ceph-mon[75204]: pgmap v569: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:25:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:25:35 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v570: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:36 compute-0 systemd[1]: Started libpod-conmon-17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2.scope.
Dec 03 21:25:36 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:36 compute-0 podman[226851]: 2025-12-03 21:25:36.521153617 +0000 UTC m=+0.775737646 container init 17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_lamport, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:25:36 compute-0 podman[226851]: 2025-12-03 21:25:36.528967547 +0000 UTC m=+0.783551566 container start 17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:25:36 compute-0 podman[226851]: 2025-12-03 21:25:36.533370864 +0000 UTC m=+0.787954883 container attach 17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_lamport, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:25:36 compute-0 dazzling_lamport[226868]: 167 167
Dec 03 21:25:36 compute-0 systemd[1]: libpod-17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2.scope: Deactivated successfully.
Dec 03 21:25:36 compute-0 podman[226851]: 2025-12-03 21:25:36.538885992 +0000 UTC m=+0.793470011 container died 17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_lamport, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:25:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-bb4f059f102988e17dc9cafdcbf454545ddc1eb398a5695e48549f2e5dd91efe-merged.mount: Deactivated successfully.
Dec 03 21:25:36 compute-0 podman[226851]: 2025-12-03 21:25:36.585753219 +0000 UTC m=+0.840337218 container remove 17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dazzling_lamport, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 03 21:25:36 compute-0 systemd[1]: libpod-conmon-17c7191304344d147fb67538dbb3d3e1603de65355d2470936f4019275967ce2.scope: Deactivated successfully.
Dec 03 21:25:36 compute-0 podman[226902]: 2025-12-03 21:25:36.817513781 +0000 UTC m=+0.052524269 container create 03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zealous_jackson, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:25:36 compute-0 systemd[1]: Started libpod-conmon-03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9.scope.
Dec 03 21:25:36 compute-0 podman[226902]: 2025-12-03 21:25:36.794769622 +0000 UTC m=+0.029780110 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:25:36 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fae31c25abede7f33f0ff9972968f33fe2cac4ef23cfc1479d7ceb982cc2dd/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fae31c25abede7f33f0ff9972968f33fe2cac4ef23cfc1479d7ceb982cc2dd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fae31c25abede7f33f0ff9972968f33fe2cac4ef23cfc1479d7ceb982cc2dd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fae31c25abede7f33f0ff9972968f33fe2cac4ef23cfc1479d7ceb982cc2dd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3fae31c25abede7f33f0ff9972968f33fe2cac4ef23cfc1479d7ceb982cc2dd/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:36 compute-0 podman[226902]: 2025-12-03 21:25:36.941373901 +0000 UTC m=+0.176384449 container init 03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zealous_jackson, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:25:36 compute-0 podman[226902]: 2025-12-03 21:25:36.959417445 +0000 UTC m=+0.194427913 container start 03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zealous_jackson, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:25:36 compute-0 podman[226902]: 2025-12-03 21:25:36.963816023 +0000 UTC m=+0.198826481 container attach 03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zealous_jackson, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:25:37 compute-0 zealous_jackson[226922]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:25:37 compute-0 zealous_jackson[226922]: --> All data devices are unavailable
Dec 03 21:25:37 compute-0 systemd[1]: libpod-03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9.scope: Deactivated successfully.
Dec 03 21:25:37 compute-0 podman[226902]: 2025-12-03 21:25:37.590439631 +0000 UTC m=+0.825450119 container died 03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zealous_jackson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:25:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-a3fae31c25abede7f33f0ff9972968f33fe2cac4ef23cfc1479d7ceb982cc2dd-merged.mount: Deactivated successfully.
Dec 03 21:25:37 compute-0 podman[226902]: 2025-12-03 21:25:37.644068028 +0000 UTC m=+0.879078486 container remove 03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zealous_jackson, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 03 21:25:37 compute-0 systemd[1]: libpod-conmon-03a4f458781716710141b045e7777268b94c396267a5133e262d796ed7345cb9.scope: Deactivated successfully.
Dec 03 21:25:37 compute-0 sudo[226790]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:37 compute-0 sudo[226981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:25:37 compute-0 sudo[226981]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:37 compute-0 sudo[226981]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:37 compute-0 sudo[227010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:25:37 compute-0 sudo[227010]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:37 compute-0 ceph-mon[75204]: pgmap v570: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v571: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:38 compute-0 podman[227061]: 2025-12-03 21:25:38.161846538 +0000 UTC m=+0.026098882 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:25:38 compute-0 podman[227061]: 2025-12-03 21:25:38.375287578 +0000 UTC m=+0.239539932 container create c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_nightingale, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:25:38 compute-0 systemd[1]: Started libpod-conmon-c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06.scope.
Dec 03 21:25:38 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:38 compute-0 podman[227061]: 2025-12-03 21:25:38.478140676 +0000 UTC m=+0.342393020 container init c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_nightingale, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:25:38 compute-0 podman[227061]: 2025-12-03 21:25:38.488666608 +0000 UTC m=+0.352918972 container start c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_nightingale, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:25:38 compute-0 podman[227061]: 2025-12-03 21:25:38.493334033 +0000 UTC m=+0.357586377 container attach c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_nightingale, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:25:38 compute-0 competent_nightingale[227078]: 167 167
Dec 03 21:25:38 compute-0 systemd[1]: libpod-c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06.scope: Deactivated successfully.
Dec 03 21:25:38 compute-0 podman[227061]: 2025-12-03 21:25:38.495263865 +0000 UTC m=+0.359516189 container died c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_nightingale, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:25:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-7229ca75e3694cf335413c3542b79580ea79994f98c7a4406ebea057a6484a95-merged.mount: Deactivated successfully.
Dec 03 21:25:38 compute-0 podman[227061]: 2025-12-03 21:25:38.531763934 +0000 UTC m=+0.396016258 container remove c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_nightingale, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:25:38 compute-0 systemd[1]: libpod-conmon-c7aa61dbc70a324a770080fda4e73b1c6d478c1c8e978397b3761e62b6af4a06.scope: Deactivated successfully.
Dec 03 21:25:38 compute-0 podman[227102]: 2025-12-03 21:25:38.775830526 +0000 UTC m=+0.071377595 container create f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_mendel, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:25:38 compute-0 systemd[1]: Started libpod-conmon-f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161.scope.
Dec 03 21:25:38 compute-0 podman[227102]: 2025-12-03 21:25:38.745691018 +0000 UTC m=+0.041238147 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:25:38 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c8c8d5c9d9bfffaf111fc96a4d44cc8766e6bfb1e71fe90751bff70b112b814/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c8c8d5c9d9bfffaf111fc96a4d44cc8766e6bfb1e71fe90751bff70b112b814/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c8c8d5c9d9bfffaf111fc96a4d44cc8766e6bfb1e71fe90751bff70b112b814/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c8c8d5c9d9bfffaf111fc96a4d44cc8766e6bfb1e71fe90751bff70b112b814/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:38 compute-0 podman[227102]: 2025-12-03 21:25:38.869712792 +0000 UTC m=+0.165259831 container init f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_mendel, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:25:38 compute-0 podman[227102]: 2025-12-03 21:25:38.879525976 +0000 UTC m=+0.175073015 container start f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_mendel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:25:38 compute-0 podman[227102]: 2025-12-03 21:25:38.882889785 +0000 UTC m=+0.178436824 container attach f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_mendel, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:25:39 compute-0 interesting_mendel[227118]: {
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:     "0": [
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:         {
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "devices": [
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "/dev/loop3"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             ],
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_name": "ceph_lv0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_size": "21470642176",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "name": "ceph_lv0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "tags": {
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cluster_name": "ceph",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.crush_device_class": "",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.encrypted": "0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.objectstore": "bluestore",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osd_id": "0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.type": "block",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.vdo": "0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.with_tpm": "0"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             },
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "type": "block",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "vg_name": "ceph_vg0"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:         }
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:     ],
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:     "1": [
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:         {
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "devices": [
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "/dev/loop4"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             ],
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_name": "ceph_lv1",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_size": "21470642176",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "name": "ceph_lv1",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "tags": {
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cluster_name": "ceph",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.crush_device_class": "",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.encrypted": "0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.objectstore": "bluestore",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osd_id": "1",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.type": "block",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.vdo": "0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.with_tpm": "0"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             },
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "type": "block",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "vg_name": "ceph_vg1"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:         }
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:     ],
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:     "2": [
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:         {
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "devices": [
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "/dev/loop5"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             ],
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_name": "ceph_lv2",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_size": "21470642176",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "name": "ceph_lv2",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "tags": {
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.cluster_name": "ceph",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.crush_device_class": "",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.encrypted": "0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.objectstore": "bluestore",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osd_id": "2",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.type": "block",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.vdo": "0",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:                 "ceph.with_tpm": "0"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             },
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "type": "block",
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:             "vg_name": "ceph_vg2"
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:         }
Dec 03 21:25:39 compute-0 interesting_mendel[227118]:     ]
Dec 03 21:25:39 compute-0 interesting_mendel[227118]: }
Dec 03 21:25:39 compute-0 podman[227102]: 2025-12-03 21:25:39.183666868 +0000 UTC m=+0.479213967 container died f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_mendel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:25:39 compute-0 systemd[1]: libpod-f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161.scope: Deactivated successfully.
Dec 03 21:25:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-3c8c8d5c9d9bfffaf111fc96a4d44cc8766e6bfb1e71fe90751bff70b112b814-merged.mount: Deactivated successfully.
Dec 03 21:25:39 compute-0 podman[227102]: 2025-12-03 21:25:39.309637034 +0000 UTC m=+0.605184113 container remove f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=interesting_mendel, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:25:39 compute-0 systemd[1]: libpod-conmon-f00adc8f13a206823081b50863a61085dca880cfa828645163398fd6b8fb0161.scope: Deactivated successfully.
Dec 03 21:25:39 compute-0 sudo[227010]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:39 compute-0 sudo[227149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:25:39 compute-0 sudo[227149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:39 compute-0 sudo[227149]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:39 compute-0 sudo[227177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:25:39 compute-0 sudo[227177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:39 compute-0 podman[227228]: 2025-12-03 21:25:39.853997787 +0000 UTC m=+0.063150764 container create 5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_dirac, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:25:39 compute-0 systemd[1]: Started libpod-conmon-5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1.scope.
Dec 03 21:25:39 compute-0 ceph-mon[75204]: pgmap v571: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:39 compute-0 podman[227228]: 2025-12-03 21:25:39.828222837 +0000 UTC m=+0.037375844 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:25:39 compute-0 podman[227228]: 2025-12-03 21:25:39.935955884 +0000 UTC m=+0.145108891 container init 5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_dirac, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:25:39 compute-0 podman[227228]: 2025-12-03 21:25:39.942998493 +0000 UTC m=+0.152151460 container start 5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_dirac, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:25:39 compute-0 podman[227228]: 2025-12-03 21:25:39.946388354 +0000 UTC m=+0.155541321 container attach 5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_dirac, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:25:39 compute-0 awesome_dirac[227251]: 167 167
Dec 03 21:25:39 compute-0 systemd[1]: libpod-5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1.scope: Deactivated successfully.
Dec 03 21:25:39 compute-0 podman[227228]: 2025-12-03 21:25:39.950699419 +0000 UTC m=+0.159852446 container died 5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_dirac, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:25:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-74bb4123efd354ab8c43b330715d654bdcdbce51d270add67dfe31afbb4776d5-merged.mount: Deactivated successfully.
Dec 03 21:25:39 compute-0 podman[227228]: 2025-12-03 21:25:39.993651971 +0000 UTC m=+0.202804908 container remove 5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=awesome_dirac, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:25:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v572: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:40 compute-0 systemd[1]: libpod-conmon-5be326ca7e262818c001620fb12e80572a9e602169197d5976c2358b9e828fc1.scope: Deactivated successfully.
Dec 03 21:25:40 compute-0 podman[227285]: 2025-12-03 21:25:40.243872508 +0000 UTC m=+0.068838996 container create 3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_edison, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:25:40 compute-0 systemd[1]: Started libpod-conmon-3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb.scope.
Dec 03 21:25:40 compute-0 podman[227285]: 2025-12-03 21:25:40.213101693 +0000 UTC m=+0.038068201 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:25:40 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:25:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf85d79994d5aab1075fb823d7ed21f36245054ef83308b102d15d062934c9e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf85d79994d5aab1075fb823d7ed21f36245054ef83308b102d15d062934c9e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf85d79994d5aab1075fb823d7ed21f36245054ef83308b102d15d062934c9e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/baf85d79994d5aab1075fb823d7ed21f36245054ef83308b102d15d062934c9e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:25:40 compute-0 podman[227285]: 2025-12-03 21:25:40.358158802 +0000 UTC m=+0.183125310 container init 3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_edison, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:25:40 compute-0 podman[227285]: 2025-12-03 21:25:40.380182912 +0000 UTC m=+0.205149410 container start 3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_edison, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:25:40 compute-0 podman[227285]: 2025-12-03 21:25:40.385408752 +0000 UTC m=+0.210375300 container attach 3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_edison, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:25:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:41 compute-0 lvm[227493]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:25:41 compute-0 lvm[227493]: VG ceph_vg1 finished
Dec 03 21:25:41 compute-0 lvm[227492]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:25:41 compute-0 lvm[227492]: VG ceph_vg0 finished
Dec 03 21:25:41 compute-0 lvm[227512]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:25:41 compute-0 lvm[227512]: VG ceph_vg2 finished
Dec 03 21:25:41 compute-0 sudo[227546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzzfmhrfyyrdgaredjpdtdswycrqftax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797140.8912778-718-118000957061143/AnsiballZ_systemd_service.py'
Dec 03 21:25:41 compute-0 sudo[227546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:41 compute-0 crazy_edison[227306]: {}
Dec 03 21:25:41 compute-0 systemd[1]: libpod-3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb.scope: Deactivated successfully.
Dec 03 21:25:41 compute-0 systemd[1]: libpod-3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb.scope: Consumed 1.360s CPU time.
Dec 03 21:25:41 compute-0 podman[227285]: 2025-12-03 21:25:41.314898788 +0000 UTC m=+1.139865286 container died 3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_edison, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:25:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-baf85d79994d5aab1075fb823d7ed21f36245054ef83308b102d15d062934c9e-merged.mount: Deactivated successfully.
Dec 03 21:25:41 compute-0 podman[227285]: 2025-12-03 21:25:41.362648358 +0000 UTC m=+1.187614856 container remove 3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=crazy_edison, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:25:41 compute-0 systemd[1]: libpod-conmon-3c86059e20a266c6a0dea653ebaaa3ad3fe58d4fd2a287912c12c94678d97afb.scope: Deactivated successfully.
Dec 03 21:25:41 compute-0 sudo[227177]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:25:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:25:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:25:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:25:41 compute-0 sudo[227563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:25:41 compute-0 sudo[227563]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:25:41 compute-0 sudo[227563]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:41 compute-0 python3.9[227549]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:41 compute-0 sudo[227546]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:41 compute-0 ceph-mon[75204]: pgmap v572: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:25:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:25:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v573: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:42 compute-0 sudo[227738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-patztqmhtmavstcelpmquwqmpsysonwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797141.7116559-718-38251669335936/AnsiballZ_systemd_service.py'
Dec 03 21:25:42 compute-0 sudo[227738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:42 compute-0 python3.9[227740]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:42 compute-0 sudo[227738]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:43 compute-0 sudo[227891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llfnqrwwxclwjcgboxtskiouaquqdkpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797142.6420484-718-243573093666122/AnsiballZ_systemd_service.py'
Dec 03 21:25:43 compute-0 sudo[227891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:43 compute-0 python3.9[227893]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:43 compute-0 sudo[227891]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:43 compute-0 podman[227895]: 2025-12-03 21:25:43.514387307 +0000 UTC m=+0.122754861 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 03 21:25:43 compute-0 sudo[228070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlouwtjtnlrrjtfysyujpnqovsybmlky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797143.5311995-718-134425069924475/AnsiballZ_systemd_service.py'
Dec 03 21:25:43 compute-0 sudo[228070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:43 compute-0 ceph-mon[75204]: pgmap v573: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v574: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:44 compute-0 python3.9[228072]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:44 compute-0 sudo[228070]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:44 compute-0 sudo[228223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwzrglraaiqeiuamqtqwywhrhdprxlia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797144.3814373-718-222969203855155/AnsiballZ_systemd_service.py'
Dec 03 21:25:44 compute-0 sudo[228223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:45 compute-0 python3.9[228225]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:45 compute-0 sudo[228223]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:45 compute-0 sudo[228376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhkayhjgdyysgbgdrtvfqfqamnzphsxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797145.2628014-718-149684364616670/AnsiballZ_systemd_service.py'
Dec 03 21:25:45 compute-0 sudo[228376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:45 compute-0 ceph-mon[75204]: pgmap v574: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v575: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:46 compute-0 python3.9[228378]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:46 compute-0 sudo[228376]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:46 compute-0 sudo[228529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtvbimahsxgffiknzyvozciflebugiaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797146.2975516-718-88700375439120/AnsiballZ_systemd_service.py'
Dec 03 21:25:46 compute-0 sudo[228529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:46 compute-0 python3.9[228531]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:47 compute-0 sudo[228529]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:47 compute-0 sudo[228682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vefeqkjfpdczrppjonjnnoanylfqtdqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797147.1789432-718-204501621411445/AnsiballZ_systemd_service.py'
Dec 03 21:25:47 compute-0 sudo[228682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:47 compute-0 python3.9[228684]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:25:47 compute-0 sudo[228682]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:47 compute-0 ceph-mon[75204]: pgmap v575: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v576: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:48 compute-0 sudo[228835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybadcceqlyhhqfsmerajiqhozgbhjwpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797148.2459142-777-154656127628485/AnsiballZ_file.py'
Dec 03 21:25:48 compute-0 sudo[228835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:48 compute-0 python3.9[228837]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:48 compute-0 sudo[228835]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:25:48.928 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:25:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:25:48.929 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:25:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:25:48.929 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:25:49 compute-0 podman[228914]: 2025-12-03 21:25:49.131735306 +0000 UTC m=+0.062451685 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 03 21:25:49 compute-0 sudo[229008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgchxljddkdptuiajsvpijoydaivzhlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797148.945914-777-193181096910299/AnsiballZ_file.py'
Dec 03 21:25:49 compute-0 sudo[229008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:49 compute-0 python3.9[229010]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:49 compute-0 sudo[229008]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:49 compute-0 ceph-mon[75204]: pgmap v576: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v577: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:50 compute-0 sudo[229160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwgcdgiaigvdlbiosehnhrdabvmytacl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797149.6524546-777-141064922395009/AnsiballZ_file.py'
Dec 03 21:25:50 compute-0 sudo[229160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:50 compute-0 python3.9[229162]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:50 compute-0 sudo[229160]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:50 compute-0 sudo[229312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmmpkiytsihucwayrnzrovwwqmlpgnth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797150.4591715-777-174532543465794/AnsiballZ_file.py'
Dec 03 21:25:50 compute-0 sudo[229312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:51 compute-0 python3.9[229314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:51 compute-0 sudo[229312]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:51 compute-0 sudo[229464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnfqtgidojueohwgobevibzpmhycginv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797151.2623818-777-181424182314444/AnsiballZ_file.py'
Dec 03 21:25:51 compute-0 sudo[229464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:25:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:25:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:25:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:25:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:25:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:25:51 compute-0 python3.9[229466]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:51 compute-0 sudo[229464]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:51 compute-0 ceph-mon[75204]: pgmap v577: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v578: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:52 compute-0 sudo[229616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwvhzqtfkuckmzmtbfuxfhzfknyxftij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797151.9932413-777-65681632587363/AnsiballZ_file.py'
Dec 03 21:25:52 compute-0 sudo[229616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:52 compute-0 python3.9[229618]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:52 compute-0 sudo[229616]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:53 compute-0 sudo[229768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zehpqppudmfnizdakgmplhazgtahhhyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797152.6975894-777-81470925380270/AnsiballZ_file.py'
Dec 03 21:25:53 compute-0 sudo[229768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:53 compute-0 python3.9[229770]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:53 compute-0 sudo[229768]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:53 compute-0 sudo[229920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuxtllmgofovlynxhtwbemiwalqjnjyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797153.4834526-777-169722050276749/AnsiballZ_file.py'
Dec 03 21:25:53 compute-0 sudo[229920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:53 compute-0 ceph-mon[75204]: pgmap v578: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v579: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:54 compute-0 python3.9[229922]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:54 compute-0 sudo[229920]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:54 compute-0 sudo[230072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbdjcxokopiqdnllbftrmoouqrcxjjqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797154.3005717-834-4766253265308/AnsiballZ_file.py'
Dec 03 21:25:54 compute-0 sudo[230072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:54 compute-0 python3.9[230074]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:54 compute-0 sudo[230072]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:55 compute-0 podman[230099]: 2025-12-03 21:25:55.132244217 +0000 UTC m=+0.068996661 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 03 21:25:55 compute-0 sudo[230241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eewbaiefppoblicnqhlkyxghlyrabfxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797155.0960474-834-159383468666570/AnsiballZ_file.py'
Dec 03 21:25:55 compute-0 sudo[230241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:25:55 compute-0 python3.9[230243]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:55 compute-0 sudo[230241]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:55 compute-0 ceph-mon[75204]: pgmap v579: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v580: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:56 compute-0 sudo[230393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ityuchbtgbtalsifomxacjktmifigjro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797155.8012874-834-88457816632322/AnsiballZ_file.py'
Dec 03 21:25:56 compute-0 sudo[230393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:56 compute-0 python3.9[230395]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:56 compute-0 sudo[230393]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:56 compute-0 sudo[230545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiycuynjndiwkbbqcqwgndzqxfnpuglp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797156.5137675-834-69909813677017/AnsiballZ_file.py'
Dec 03 21:25:56 compute-0 sudo[230545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:56 compute-0 ceph-mon[75204]: pgmap v580: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:57 compute-0 python3.9[230547]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:57 compute-0 sudo[230545]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:57 compute-0 sudo[230697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zywakqtpqseyyqztkoygpssxeoegxxvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797157.1939995-834-148431659924559/AnsiballZ_file.py'
Dec 03 21:25:57 compute-0 sudo[230697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:57 compute-0 python3.9[230699]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:57 compute-0 sudo[230697]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v581: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:58 compute-0 sudo[230849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxgpxsprupbtajmollkgozlqxvdfmkcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797157.8803477-834-262358492600415/AnsiballZ_file.py'
Dec 03 21:25:58 compute-0 sudo[230849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:58 compute-0 python3.9[230851]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:58 compute-0 sudo[230849]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:59 compute-0 ceph-mon[75204]: pgmap v581: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:25:59 compute-0 sudo[231001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deifoqpowpufaxkckbkaqooiityueisx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797158.6900158-834-200066567948399/AnsiballZ_file.py'
Dec 03 21:25:59 compute-0 sudo[231001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:25:59 compute-0 python3.9[231003]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:25:59 compute-0 sudo[231001]: pam_unix(sudo:session): session closed for user root
Dec 03 21:25:59 compute-0 sudo[231153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miwsihionuqqyvywavzzcoybsokwetgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797159.4838457-834-9843003548620/AnsiballZ_file.py'
Dec 03 21:25:59 compute-0 sudo[231153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:00 compute-0 python3.9[231155]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:26:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v582: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:00 compute-0 sudo[231153]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:00 compute-0 sudo[231305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhqwjqeufesxymoswahkteericemthvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797160.3405285-892-126586007742585/AnsiballZ_command.py'
Dec 03 21:26:00 compute-0 sudo[231305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:00 compute-0 python3.9[231307]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:00 compute-0 sudo[231305]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:01 compute-0 ceph-mon[75204]: pgmap v582: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:01 compute-0 python3.9[231459]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 03 21:26:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v583: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:02 compute-0 sudo[231609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfuefjqfojnppafgvzqulvbevdxzjwqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797162.2600842-910-237092607609741/AnsiballZ_systemd_service.py'
Dec 03 21:26:02 compute-0 sudo[231609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:02 compute-0 python3.9[231611]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:26:02 compute-0 systemd[1]: Reloading.
Dec 03 21:26:03 compute-0 ceph-mon[75204]: pgmap v583: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:03 compute-0 systemd-sysv-generator[231642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:26:03 compute-0 systemd-rc-local-generator[231638]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:26:03 compute-0 sudo[231609]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:03 compute-0 sudo[231796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynddhiekxzwwgeunkabirkjbivpyqixq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797163.5849042-918-89953162675411/AnsiballZ_command.py'
Dec 03 21:26:03 compute-0 sudo[231796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v584: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:04 compute-0 python3.9[231798]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:04 compute-0 sudo[231796]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:04 compute-0 sudo[231949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulyxdrelyxiwjopsvrkbglqpcfdoeitm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797164.4249222-918-258799032602445/AnsiballZ_command.py'
Dec 03 21:26:04 compute-0 sudo[231949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:04 compute-0 python3.9[231951]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:04 compute-0 sudo[231949]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:05 compute-0 ceph-mon[75204]: pgmap v584: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:05 compute-0 sudo[232102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxffvzlnhyqljqzojsijwvccffhabtec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797165.164125-918-127554427786706/AnsiballZ_command.py'
Dec 03 21:26:05 compute-0 sudo[232102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:05 compute-0 python3.9[232104]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:05 compute-0 sudo[232102]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v585: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:06 compute-0 sudo[232255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juiceqkbceprinzaegtyfavdrlruhizx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797165.884402-918-137633444308655/AnsiballZ_command.py'
Dec 03 21:26:06 compute-0 sudo[232255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:06 compute-0 python3.9[232257]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:06 compute-0 sudo[232255]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:07 compute-0 sudo[232408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usrehvulnggzboblvxvwlhakekijcana ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797166.626062-918-124293821660601/AnsiballZ_command.py'
Dec 03 21:26:07 compute-0 sudo[232408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:07 compute-0 python3.9[232410]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:07 compute-0 ceph-mon[75204]: pgmap v585: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:07 compute-0 sudo[232408]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:07 compute-0 sudo[232561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdrezjcjjjaehevmbghgswmdmxvfbgrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797167.445351-918-97054011811596/AnsiballZ_command.py'
Dec 03 21:26:07 compute-0 sudo[232561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v586: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:08 compute-0 python3.9[232563]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:08 compute-0 sudo[232561]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:08 compute-0 sudo[232714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yemrlbqxrjsjlfolcbcrfbbhexnsfked ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797168.2625701-918-105629807752332/AnsiballZ_command.py'
Dec 03 21:26:08 compute-0 sudo[232714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:08 compute-0 python3.9[232716]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:08 compute-0 sudo[232714]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:09 compute-0 ceph-mon[75204]: pgmap v586: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:09 compute-0 sudo[232867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puixcfyznoqjzvarwzhfxphbnamzywqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797168.9713817-918-265671425051328/AnsiballZ_command.py'
Dec 03 21:26:09 compute-0 sudo[232867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:09 compute-0 python3.9[232869]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 03 21:26:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v587: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:10 compute-0 sudo[232867]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:11 compute-0 ceph-mon[75204]: pgmap v587: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v588: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:12 compute-0 sudo[233020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pveiejuwcdhrvgviqkmwufiaciwggxzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797171.7708592-997-66566080124832/AnsiballZ_file.py'
Dec 03 21:26:12 compute-0 sudo[233020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:12 compute-0 python3.9[233022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:12 compute-0 sudo[233020]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:12 compute-0 sudo[233172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teuhuokmieiykvuhxfvghoaqxhjuuyqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797172.594245-997-222715227361482/AnsiballZ_file.py'
Dec 03 21:26:12 compute-0 sudo[233172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:13 compute-0 python3.9[233174]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:13 compute-0 sudo[233172]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:13 compute-0 ceph-mon[75204]: pgmap v588: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:13 compute-0 sudo[233335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyuzeeouvvakdgrpfiadvktyzwbmhgnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797173.3778236-997-117609597875516/AnsiballZ_file.py'
Dec 03 21:26:13 compute-0 sudo[233335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:13 compute-0 podman[233298]: 2025-12-03 21:26:13.84311763 +0000 UTC m=+0.144307070 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:26:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v589: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:14 compute-0 python3.9[233337]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:14 compute-0 sudo[233335]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:14 compute-0 sudo[233500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyqurfkjfcxsnoupczodgbbxmaiefrdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797174.3900037-1019-235580275262284/AnsiballZ_file.py'
Dec 03 21:26:14 compute-0 sudo[233500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:14 compute-0 python3.9[233502]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:14 compute-0 sudo[233500]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:15 compute-0 ceph-mon[75204]: pgmap v589: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:15 compute-0 sudo[233652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpheyyzobffbannydlbccmbsulxgkpoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797175.1447093-1019-229443727141689/AnsiballZ_file.py'
Dec 03 21:26:15 compute-0 sudo[233652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:15 compute-0 python3.9[233654]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:15 compute-0 sudo[233652]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v590: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:16 compute-0 sudo[233804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncqbgusbtaxqmbzopefduutxzmbprjzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797175.9779513-1019-147177034135887/AnsiballZ_file.py'
Dec 03 21:26:16 compute-0 sudo[233804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:16 compute-0 python3.9[233806]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:16 compute-0 sudo[233804]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:17 compute-0 sudo[233956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jltqxavfaqtuvmnblsnvbhowezpxvbtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797176.796109-1019-73423606029765/AnsiballZ_file.py'
Dec 03 21:26:17 compute-0 sudo[233956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:17 compute-0 ceph-mon[75204]: pgmap v590: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:17 compute-0 python3.9[233958]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:17 compute-0 sudo[233956]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:17 compute-0 sudo[234108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwsfxtxspilbnpwozsbzoojkiwhelfji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797177.622594-1019-74969823420293/AnsiballZ_file.py'
Dec 03 21:26:17 compute-0 sudo[234108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v591: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:18 compute-0 python3.9[234110]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:18 compute-0 sudo[234108]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:18 compute-0 sudo[234260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjwjcxqdqcsycdkwclchacrmhmobuaiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797178.3715057-1019-226179340106727/AnsiballZ_file.py'
Dec 03 21:26:18 compute-0 sudo[234260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:18 compute-0 python3.9[234262]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:18 compute-0 sudo[234260]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:19 compute-0 ceph-mon[75204]: pgmap v591: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:19 compute-0 sudo[234423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phgmgchbatuxpxiibyuovchinwbcbudq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797179.1865098-1019-232608565434053/AnsiballZ_file.py'
Dec 03 21:26:19 compute-0 sudo[234423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:19 compute-0 podman[234386]: 2025-12-03 21:26:19.642277729 +0000 UTC m=+0.089074377 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:26:19 compute-0 python3.9[234430]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:19 compute-0 sudo[234423]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v592: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:26:21
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.meta', '.mgr', 'vms', 'volumes', 'images', 'backups', 'cephfs.cephfs.data']
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:26:21 compute-0 ceph-mon[75204]: pgmap v592: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:26:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:26:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v593: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:23 compute-0 ceph-mon[75204]: pgmap v593: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v594: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:25 compute-0 sudo[234595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvznxcjftnpavaawhrgrjjsaplntlbvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797184.802467-1208-61448385550184/AnsiballZ_getent.py'
Dec 03 21:26:25 compute-0 sudo[234595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:25 compute-0 podman[234557]: 2025-12-03 21:26:25.373239364 +0000 UTC m=+0.081277338 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:26:25 compute-0 ceph-mon[75204]: pgmap v594: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:25 compute-0 python3.9[234601]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 03 21:26:25 compute-0 sudo[234595]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v595: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:26 compute-0 sudo[234752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgmvcwwgyxusdceuzvvxysqatdjddsaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797185.8059118-1216-187679684311953/AnsiballZ_group.py'
Dec 03 21:26:26 compute-0 sudo[234752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:26 compute-0 python3.9[234754]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 03 21:26:26 compute-0 groupadd[234755]: group added to /etc/group: name=nova, GID=42436
Dec 03 21:26:26 compute-0 groupadd[234755]: group added to /etc/gshadow: name=nova
Dec 03 21:26:26 compute-0 groupadd[234755]: new group: name=nova, GID=42436
Dec 03 21:26:26 compute-0 sudo[234752]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:27 compute-0 ceph-mon[75204]: pgmap v595: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:27 compute-0 sudo[234910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqnfjwxutwjkegijxjzqhvwpxyrfmfcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797186.9560652-1224-103380709351324/AnsiballZ_user.py'
Dec 03 21:26:27 compute-0 sudo[234910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:27 compute-0 python3.9[234912]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 03 21:26:27 compute-0 useradd[234914]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 03 21:26:27 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:26:27 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:26:27 compute-0 useradd[234914]: add 'nova' to group 'libvirt'
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:26:27 compute-0 useradd[234914]: add 'nova' to shadow group 'libvirt'
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:26:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:26:27 compute-0 sudo[234910]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v596: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:28 compute-0 sshd-session[234946]: Accepted publickey for zuul from 192.168.122.30 port 53480 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:26:28 compute-0 systemd-logind[787]: New session 50 of user zuul.
Dec 03 21:26:28 compute-0 systemd[1]: Started Session 50 of User zuul.
Dec 03 21:26:28 compute-0 sshd-session[234946]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:26:28 compute-0 sshd-session[234949]: Received disconnect from 192.168.122.30 port 53480:11: disconnected by user
Dec 03 21:26:28 compute-0 sshd-session[234949]: Disconnected from user zuul 192.168.122.30 port 53480
Dec 03 21:26:28 compute-0 sshd-session[234946]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:26:28 compute-0 systemd[1]: session-50.scope: Deactivated successfully.
Dec 03 21:26:28 compute-0 systemd-logind[787]: Session 50 logged out. Waiting for processes to exit.
Dec 03 21:26:28 compute-0 systemd-logind[787]: Removed session 50.
Dec 03 21:26:29 compute-0 ceph-mon[75204]: pgmap v596: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:29 compute-0 python3.9[235099]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v597: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:30 compute-0 python3.9[235220]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797189.1395714-1249-110186999234054/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:30 compute-0 python3.9[235370]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:31 compute-0 ceph-mon[75204]: pgmap v597: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:31 compute-0 python3.9[235446]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v598: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:32 compute-0 python3.9[235596]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:32 compute-0 python3.9[235717]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797191.6330242-1249-84863402190052/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:33 compute-0 ceph-mon[75204]: pgmap v598: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:33 compute-0 python3.9[235867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v599: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:34 compute-0 python3.9[235988]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797193.017465-1249-148636484996121/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:35 compute-0 python3.9[236138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:35 compute-0 ceph-mon[75204]: pgmap v599: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:35 compute-0 python3.9[236259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797194.4852717-1249-120755419875994/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v600: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:36 compute-0 python3.9[236409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:37 compute-0 python3.9[236530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797195.8696108-1249-260822366056509/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:37 compute-0 ceph-mon[75204]: pgmap v600: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:37 compute-0 sudo[236680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asuqiwkxiygvtxzlcpkhjnibqcicdotp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797197.3487978-1332-256161765720065/AnsiballZ_file.py'
Dec 03 21:26:37 compute-0 sudo[236680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:37 compute-0 python3.9[236682]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:26:37 compute-0 sudo[236680]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v601: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:38 compute-0 sudo[236832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqxhbvmvisavpuzcjdlhlayzgzwuswmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797198.185659-1340-96865867863843/AnsiballZ_copy.py'
Dec 03 21:26:38 compute-0 sudo[236832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:38 compute-0 python3.9[236834]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:26:38 compute-0 sudo[236832]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:39 compute-0 sudo[236984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvahrprlaramnrcvluehtddvhdytxhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797199.0377789-1348-17771942863212/AnsiballZ_stat.py'
Dec 03 21:26:39 compute-0 sudo[236984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:39 compute-0 ceph-mon[75204]: pgmap v601: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:39 compute-0 python3.9[236986]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:26:39 compute-0 sudo[236984]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v602: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:40 compute-0 sudo[237136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldfajtjtbezqrfyzrmbhxyvblslutwtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797199.8087378-1356-222310726188555/AnsiballZ_stat.py'
Dec 03 21:26:40 compute-0 sudo[237136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:40 compute-0 python3.9[237138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:40 compute-0 sudo[237136]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:40 compute-0 sudo[237259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxcmkockubqyihlemgoryuqpkjmjnmzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797199.8087378-1356-222310726188555/AnsiballZ_copy.py'
Dec 03 21:26:40 compute-0 sudo[237259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:41 compute-0 python3.9[237261]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764797199.8087378-1356-222310726188555/.source _original_basename=.4c23dsph follow=False checksum=19000a86675532e29c6f41be5522228461db82b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 03 21:26:41 compute-0 sudo[237259]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:41 compute-0 ceph-mon[75204]: pgmap v602: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:41 compute-0 sudo[237311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:26:41 compute-0 sudo[237311]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:41 compute-0 sudo[237311]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:41 compute-0 sudo[237365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:26:41 compute-0 sudo[237365]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v603: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:42 compute-0 python3.9[237465]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:26:42 compute-0 sudo[237365]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:26:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:26:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:26:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:26:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:26:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:26:42 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:26:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:26:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:26:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:26:42 compute-0 sudo[237545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:26:42 compute-0 sudo[237545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:42 compute-0 sudo[237545]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:42 compute-0 sudo[237599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:26:42 compute-0 sudo[237599]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:42 compute-0 podman[237709]: 2025-12-03 21:26:42.981242366 +0000 UTC m=+0.067141450 container create d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_dirac, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:26:42 compute-0 python3.9[237697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:43 compute-0 systemd[1]: Started libpod-conmon-d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3.scope.
Dec 03 21:26:43 compute-0 podman[237709]: 2025-12-03 21:26:42.952836635 +0000 UTC m=+0.038735749 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:26:43 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:26:43 compute-0 podman[237709]: 2025-12-03 21:26:43.083547097 +0000 UTC m=+0.169446231 container init d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_dirac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:26:43 compute-0 podman[237709]: 2025-12-03 21:26:43.094491681 +0000 UTC m=+0.180390725 container start d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_dirac, ceph=True, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:26:43 compute-0 podman[237709]: 2025-12-03 21:26:43.097719737 +0000 UTC m=+0.183618871 container attach d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_dirac, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:26:43 compute-0 musing_dirac[237725]: 167 167
Dec 03 21:26:43 compute-0 systemd[1]: libpod-d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3.scope: Deactivated successfully.
Dec 03 21:26:43 compute-0 podman[237709]: 2025-12-03 21:26:43.103222545 +0000 UTC m=+0.189121619 container died d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_dirac, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Dec 03 21:26:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-c8921e90701ad8fbdfc9cd7ea4b160d5415373e4d516ddcaaa4c2d52ee25d39e-merged.mount: Deactivated successfully.
Dec 03 21:26:43 compute-0 podman[237709]: 2025-12-03 21:26:43.158194117 +0000 UTC m=+0.244093171 container remove d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_dirac, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3)
Dec 03 21:26:43 compute-0 systemd[1]: libpod-conmon-d77dad3f9996074bb99c1064362e954e3a1d3a2b44a7840162b68c78b8d592f3.scope: Deactivated successfully.
Dec 03 21:26:43 compute-0 podman[237817]: 2025-12-03 21:26:43.343965304 +0000 UTC m=+0.046243990 container create a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_chatelet, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:26:43 compute-0 systemd[1]: Started libpod-conmon-a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2.scope.
Dec 03 21:26:43 compute-0 podman[237817]: 2025-12-03 21:26:43.325437887 +0000 UTC m=+0.027716583 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:26:43 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:26:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d18325dfb2044c7dc4e353f3ff08951d4bc5e2fb9d5fe1c1827e2b1f47c9405/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d18325dfb2044c7dc4e353f3ff08951d4bc5e2fb9d5fe1c1827e2b1f47c9405/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d18325dfb2044c7dc4e353f3ff08951d4bc5e2fb9d5fe1c1827e2b1f47c9405/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d18325dfb2044c7dc4e353f3ff08951d4bc5e2fb9d5fe1c1827e2b1f47c9405/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d18325dfb2044c7dc4e353f3ff08951d4bc5e2fb9d5fe1c1827e2b1f47c9405/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:43 compute-0 podman[237817]: 2025-12-03 21:26:43.440939181 +0000 UTC m=+0.143217947 container init a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_chatelet, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:26:43 compute-0 podman[237817]: 2025-12-03 21:26:43.452250564 +0000 UTC m=+0.154529280 container start a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_chatelet, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:26:43 compute-0 podman[237817]: 2025-12-03 21:26:43.457253659 +0000 UTC m=+0.159532365 container attach a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_chatelet, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:26:43 compute-0 ceph-mon[75204]: pgmap v603: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:43 compute-0 python3.9[237891]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797202.4094832-1382-85524402632217/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:43 compute-0 romantic_chatelet[237860]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:26:43 compute-0 romantic_chatelet[237860]: --> All data devices are unavailable
Dec 03 21:26:43 compute-0 systemd[1]: libpod-a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2.scope: Deactivated successfully.
Dec 03 21:26:43 compute-0 podman[237817]: 2025-12-03 21:26:43.973443297 +0000 UTC m=+0.675721993 container died a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_chatelet, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:26:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-4d18325dfb2044c7dc4e353f3ff08951d4bc5e2fb9d5fe1c1827e2b1f47c9405-merged.mount: Deactivated successfully.
Dec 03 21:26:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v604: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:44 compute-0 podman[237817]: 2025-12-03 21:26:44.038631564 +0000 UTC m=+0.740910250 container remove a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_chatelet, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:26:44 compute-0 systemd[1]: libpod-conmon-a4fe968f85186fcb66f796d63a65a264e4340847bb7eee2ff2803200b69a1da2.scope: Deactivated successfully.
Dec 03 21:26:44 compute-0 sudo[237599]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:44 compute-0 podman[237987]: 2025-12-03 21:26:44.149901895 +0000 UTC m=+0.129116220 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 21:26:44 compute-0 sudo[238054]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:26:44 compute-0 sudo[238054]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:44 compute-0 sudo[238054]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:44 compute-0 sudo[238107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:26:44 compute-0 sudo[238107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:44 compute-0 python3.9[238126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 03 21:26:44 compute-0 podman[238200]: 2025-12-03 21:26:44.631277261 +0000 UTC m=+0.069724409 container create 69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_hodgkin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:26:44 compute-0 systemd[1]: Started libpod-conmon-69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62.scope.
Dec 03 21:26:44 compute-0 podman[238200]: 2025-12-03 21:26:44.602285525 +0000 UTC m=+0.040732723 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:26:44 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:26:44 compute-0 podman[238200]: 2025-12-03 21:26:44.734183178 +0000 UTC m=+0.172630376 container init 69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_hodgkin, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0)
Dec 03 21:26:44 compute-0 podman[238200]: 2025-12-03 21:26:44.743440426 +0000 UTC m=+0.181887564 container start 69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_hodgkin, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:26:44 compute-0 podman[238200]: 2025-12-03 21:26:44.747404683 +0000 UTC m=+0.185851841 container attach 69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_hodgkin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:26:44 compute-0 serene_hodgkin[238240]: 167 167
Dec 03 21:26:44 compute-0 systemd[1]: libpod-69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62.scope: Deactivated successfully.
Dec 03 21:26:44 compute-0 podman[238200]: 2025-12-03 21:26:44.751068821 +0000 UTC m=+0.189515959 container died 69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_hodgkin, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:26:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-d82c22506a8c547adba70ba6cb52c438dba43b32a1543d84cfbab1f460fa50fb-merged.mount: Deactivated successfully.
Dec 03 21:26:44 compute-0 podman[238200]: 2025-12-03 21:26:44.797178865 +0000 UTC m=+0.235625983 container remove 69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_hodgkin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:26:44 compute-0 systemd[1]: libpod-conmon-69f80d291e77d52c327aadb75789cd451f94f8a04673125305795bc8e133ed62.scope: Deactivated successfully.
Dec 03 21:26:45 compute-0 podman[238314]: 2025-12-03 21:26:45.007297284 +0000 UTC m=+0.062127975 container create 8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=keen_khayyam, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:26:45 compute-0 python3.9[238307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764797203.8662555-1397-75248352782967/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 03 21:26:45 compute-0 systemd[1]: Started libpod-conmon-8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc.scope.
Dec 03 21:26:45 compute-0 podman[238314]: 2025-12-03 21:26:44.976479649 +0000 UTC m=+0.031310350 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:26:45 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:26:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68876c974428ba1116d946dddf9b9b3f5eca22e7b7151ed243fd911faa852b87/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68876c974428ba1116d946dddf9b9b3f5eca22e7b7151ed243fd911faa852b87/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68876c974428ba1116d946dddf9b9b3f5eca22e7b7151ed243fd911faa852b87/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68876c974428ba1116d946dddf9b9b3f5eca22e7b7151ed243fd911faa852b87/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:45 compute-0 podman[238314]: 2025-12-03 21:26:45.128971874 +0000 UTC m=+0.183802605 container init 8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=keen_khayyam, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:26:45 compute-0 podman[238314]: 2025-12-03 21:26:45.141655964 +0000 UTC m=+0.196486625 container start 8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=keen_khayyam, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:26:45 compute-0 podman[238314]: 2025-12-03 21:26:45.145431715 +0000 UTC m=+0.200262406 container attach 8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=keen_khayyam, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:26:45 compute-0 ceph-mon[75204]: pgmap v604: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:45 compute-0 keen_khayyam[238332]: {
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:     "0": [
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:         {
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "devices": [
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "/dev/loop3"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             ],
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_name": "ceph_lv0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_size": "21470642176",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "name": "ceph_lv0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "tags": {
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cluster_name": "ceph",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.crush_device_class": "",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.encrypted": "0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.objectstore": "bluestore",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osd_id": "0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.type": "block",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.vdo": "0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.with_tpm": "0"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             },
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "type": "block",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "vg_name": "ceph_vg0"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:         }
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:     ],
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:     "1": [
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:         {
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "devices": [
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "/dev/loop4"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             ],
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_name": "ceph_lv1",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_size": "21470642176",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "name": "ceph_lv1",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "tags": {
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cluster_name": "ceph",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.crush_device_class": "",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.encrypted": "0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.objectstore": "bluestore",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osd_id": "1",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.type": "block",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.vdo": "0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.with_tpm": "0"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             },
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "type": "block",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "vg_name": "ceph_vg1"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:         }
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:     ],
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:     "2": [
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:         {
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "devices": [
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "/dev/loop5"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             ],
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_name": "ceph_lv2",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_size": "21470642176",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "name": "ceph_lv2",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "tags": {
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.cluster_name": "ceph",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.crush_device_class": "",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.encrypted": "0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.objectstore": "bluestore",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osd_id": "2",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.type": "block",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.vdo": "0",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:                 "ceph.with_tpm": "0"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             },
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "type": "block",
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:             "vg_name": "ceph_vg2"
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:         }
Dec 03 21:26:45 compute-0 keen_khayyam[238332]:     ]
Dec 03 21:26:45 compute-0 keen_khayyam[238332]: }
Dec 03 21:26:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:45 compute-0 systemd[1]: libpod-8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc.scope: Deactivated successfully.
Dec 03 21:26:45 compute-0 podman[238314]: 2025-12-03 21:26:45.541986399 +0000 UTC m=+0.596817090 container died 8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=keen_khayyam, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:26:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-68876c974428ba1116d946dddf9b9b3f5eca22e7b7151ed243fd911faa852b87-merged.mount: Deactivated successfully.
Dec 03 21:26:45 compute-0 podman[238314]: 2025-12-03 21:26:45.601881664 +0000 UTC m=+0.656712315 container remove 8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=keen_khayyam, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:26:45 compute-0 systemd[1]: libpod-conmon-8337d83a3b20511f9491db1eff44f848c238f9aca2ca371d00ed0fe9e00d41dc.scope: Deactivated successfully.
Dec 03 21:26:45 compute-0 sudo[238107]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:45 compute-0 sudo[238464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:26:45 compute-0 sudo[238464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:45 compute-0 sudo[238464]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:45 compute-0 sudo[238533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nquhulfjdxtcdwabeciepvkgvplrwzuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797205.431621-1414-199381863262465/AnsiballZ_container_config_data.py'
Dec 03 21:26:45 compute-0 sudo[238533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:45 compute-0 sudo[238521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:26:45 compute-0 sudo[238521]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:45 compute-0 python3.9[238551]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 03 21:26:45 compute-0 sudo[238533]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v605: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:46 compute-0 podman[238566]: 2025-12-03 21:26:46.112211196 +0000 UTC m=+0.037925708 container create d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_nash, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default)
Dec 03 21:26:46 compute-0 systemd[1]: Started libpod-conmon-d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241.scope.
Dec 03 21:26:46 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:26:46 compute-0 podman[238566]: 2025-12-03 21:26:46.175051499 +0000 UTC m=+0.100766031 container init d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_nash, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:26:46 compute-0 podman[238566]: 2025-12-03 21:26:46.180479374 +0000 UTC m=+0.106193876 container start d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_nash, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:26:46 compute-0 podman[238566]: 2025-12-03 21:26:46.18408157 +0000 UTC m=+0.109796102 container attach d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_nash, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:26:46 compute-0 nifty_nash[238605]: 167 167
Dec 03 21:26:46 compute-0 systemd[1]: libpod-d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241.scope: Deactivated successfully.
Dec 03 21:26:46 compute-0 podman[238566]: 2025-12-03 21:26:46.185940181 +0000 UTC m=+0.111654713 container died d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_nash, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:26:46 compute-0 podman[238566]: 2025-12-03 21:26:46.096091254 +0000 UTC m=+0.021805776 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:26:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-674c565abb1d8ca3a91d8e5e3b5d7e10a3afe006930557859088e6b4f9ed1761-merged.mount: Deactivated successfully.
Dec 03 21:26:46 compute-0 podman[238566]: 2025-12-03 21:26:46.232777396 +0000 UTC m=+0.158491938 container remove d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nifty_nash, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 03 21:26:46 compute-0 systemd[1]: libpod-conmon-d590af1934cdc2100ed46b59eab06d17a1d96eba7f2b1aa4ac08c97382012241.scope: Deactivated successfully.
Dec 03 21:26:46 compute-0 podman[238687]: 2025-12-03 21:26:46.469467906 +0000 UTC m=+0.069764590 container create 9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=admiring_lamport, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:26:46 compute-0 systemd[1]: Started libpod-conmon-9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64.scope.
Dec 03 21:26:46 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:26:46 compute-0 podman[238687]: 2025-12-03 21:26:46.443288185 +0000 UTC m=+0.043584949 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:26:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39015a02d201a1586ab9c00dd22aad65b783a5f1e21115adedb1c1ca10fe57b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39015a02d201a1586ab9c00dd22aad65b783a5f1e21115adedb1c1ca10fe57b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39015a02d201a1586ab9c00dd22aad65b783a5f1e21115adedb1c1ca10fe57b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e39015a02d201a1586ab9c00dd22aad65b783a5f1e21115adedb1c1ca10fe57b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:26:46 compute-0 podman[238687]: 2025-12-03 21:26:46.552599724 +0000 UTC m=+0.152896438 container init 9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=admiring_lamport, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:26:46 compute-0 podman[238687]: 2025-12-03 21:26:46.558613425 +0000 UTC m=+0.158910109 container start 9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=admiring_lamport, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:26:46 compute-0 podman[238687]: 2025-12-03 21:26:46.561687057 +0000 UTC m=+0.161983741 container attach 9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=admiring_lamport, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030)
Dec 03 21:26:46 compute-0 sudo[238776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjicamfsoaqxplqcgcmunxusooevqbhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797206.2623568-1423-113564394914830/AnsiballZ_container_config_hash.py'
Dec 03 21:26:46 compute-0 sudo[238776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:46 compute-0 python3.9[238778]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 03 21:26:46 compute-0 sudo[238776]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:47 compute-0 lvm[238923]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:26:47 compute-0 lvm[238923]: VG ceph_vg1 finished
Dec 03 21:26:47 compute-0 lvm[238921]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:26:47 compute-0 lvm[238921]: VG ceph_vg0 finished
Dec 03 21:26:47 compute-0 lvm[238931]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:26:47 compute-0 lvm[238931]: VG ceph_vg2 finished
Dec 03 21:26:47 compute-0 lvm[238937]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:26:47 compute-0 lvm[238937]: VG ceph_vg1 finished
Dec 03 21:26:47 compute-0 lvm[238957]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:26:47 compute-0 lvm[238957]: VG ceph_vg1 finished
Dec 03 21:26:47 compute-0 admiring_lamport[238744]: {}
Dec 03 21:26:47 compute-0 systemd[1]: libpod-9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64.scope: Deactivated successfully.
Dec 03 21:26:47 compute-0 podman[238687]: 2025-12-03 21:26:47.40724977 +0000 UTC m=+1.007546444 container died 9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=admiring_lamport, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:26:47 compute-0 systemd[1]: libpod-9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64.scope: Consumed 1.351s CPU time.
Dec 03 21:26:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-e39015a02d201a1586ab9c00dd22aad65b783a5f1e21115adedb1c1ca10fe57b-merged.mount: Deactivated successfully.
Dec 03 21:26:47 compute-0 podman[238687]: 2025-12-03 21:26:47.451051533 +0000 UTC m=+1.051348207 container remove 9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=admiring_lamport, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 03 21:26:47 compute-0 systemd[1]: libpod-conmon-9cb07886dee851a9de5c66cb78a119c21a2f40db861ba3c95456452fb8ce1a64.scope: Deactivated successfully.
Dec 03 21:26:47 compute-0 sudo[239019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fivbihknzmkbdgogmeeggrlpoacvrbzi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764797207.1865838-1433-184945066426353/AnsiballZ_edpm_container_manage.py'
Dec 03 21:26:47 compute-0 sudo[239019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:47 compute-0 ceph-mon[75204]: pgmap v605: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:47 compute-0 sudo[238521]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:26:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:26:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:26:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:26:47 compute-0 sudo[239022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:26:47 compute-0 sudo[239022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:26:47 compute-0 sudo[239022]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:47 compute-0 python3[239021]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 03 21:26:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v606: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:26:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:26:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:26:48.930 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:26:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:26:48.931 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:26:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:26:48.931 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:26:49 compute-0 ceph-mon[75204]: pgmap v606: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v607: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:51 compute-0 ceph-mon[75204]: pgmap v607: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:26:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:26:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:26:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:26:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:26:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:26:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v608: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:53 compute-0 podman[239100]: 2025-12-03 21:26:53.463634921 +0000 UTC m=+3.440159644 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:26:54 compute-0 ceph-mon[75204]: pgmap v608: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v609: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v610: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:26:57 compute-0 ceph-mon[75204]: pgmap v609: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:57 compute-0 podman[239140]: 2025-12-03 21:26:57.600790686 +0000 UTC m=+1.532697612 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 03 21:26:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v611: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:58 compute-0 podman[239061]: 2025-12-03 21:26:58.102276741 +0000 UTC m=+10.268996799 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 03 21:26:58 compute-0 podman[239189]: 2025-12-03 21:26:58.329842918 +0000 UTC m=+0.098252334 container create 5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251125)
Dec 03 21:26:58 compute-0 podman[239189]: 2025-12-03 21:26:58.257763466 +0000 UTC m=+0.026172952 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 03 21:26:58 compute-0 python3[239021]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 03 21:26:58 compute-0 sudo[239019]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:58 compute-0 ceph-mon[75204]: pgmap v610: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:26:59 compute-0 sudo[239378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-andqyxrporoyvkrtstxvxenmhpqqqsgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797218.706622-1441-225103647767876/AnsiballZ_stat.py'
Dec 03 21:26:59 compute-0 sudo[239378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:26:59 compute-0 python3.9[239380]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:26:59 compute-0 sudo[239378]: pam_unix(sudo:session): session closed for user root
Dec 03 21:26:59 compute-0 ceph-mon[75204]: pgmap v611: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v612: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:00 compute-0 sudo[239532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjwyiewidcbkutvujjnkvaqvisbgeuqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797219.7479868-1453-133661257051153/AnsiballZ_container_config_data.py'
Dec 03 21:27:00 compute-0 sudo[239532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:00 compute-0 python3.9[239534]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 03 21:27:00 compute-0 sudo[239532]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:00 compute-0 sudo[239684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zopbuvxadqwfshimkcmrdhwcrtlohcah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797220.5491402-1462-233837379877795/AnsiballZ_container_config_hash.py'
Dec 03 21:27:00 compute-0 sudo[239684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:01 compute-0 ceph-mon[75204]: pgmap v612: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:01 compute-0 python3.9[239686]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 03 21:27:01 compute-0 sudo[239684]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:01 compute-0 sudo[239836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysqyvubvepblytzotsbylvpotpkhptrm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764797221.5560598-1472-226506325431367/AnsiballZ_edpm_container_manage.py'
Dec 03 21:27:01 compute-0 sudo[239836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v613: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:02 compute-0 python3[239838]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 03 21:27:02 compute-0 podman[239874]: 2025-12-03 21:27:02.548832125 +0000 UTC m=+0.068783493 container create 6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 21:27:02 compute-0 podman[239874]: 2025-12-03 21:27:02.511526436 +0000 UTC m=+0.031477864 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 03 21:27:02 compute-0 python3[239838]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 03 21:27:02 compute-0 sudo[239836]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:03 compute-0 ceph-mon[75204]: pgmap v613: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:03 compute-0 sudo[240063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shlxrwhxgqmogrifiiqwrfwdjumbammb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797222.9148114-1480-210494220366566/AnsiballZ_stat.py'
Dec 03 21:27:03 compute-0 sudo[240063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:03 compute-0 python3.9[240065]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:27:03 compute-0 sudo[240063]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v614: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:04 compute-0 sudo[240218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwodcffjozeptdyzbjfwrxgawcrehaue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797223.7325094-1489-146169705168541/AnsiballZ_file.py'
Dec 03 21:27:04 compute-0 sudo[240218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:04 compute-0 python3.9[240220]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:27:04 compute-0 sudo[240218]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:04 compute-0 sudo[240370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqagdyvnvujzxsgziclhbizidppsnncg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797224.4154875-1489-193927313095374/AnsiballZ_copy.py'
Dec 03 21:27:04 compute-0 sudo[240370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:05 compute-0 python3.9[240372]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764797224.4154875-1489-193927313095374/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 03 21:27:05 compute-0 sudo[240370]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:05 compute-0 ceph-mon[75204]: pgmap v614: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:05 compute-0 sudo[240446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngaviovitevrgiymckxugbipdqkrvxsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797224.4154875-1489-193927313095374/AnsiballZ_systemd.py'
Dec 03 21:27:05 compute-0 sudo[240446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:05 compute-0 python3.9[240448]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 03 21:27:05 compute-0 systemd[1]: Reloading.
Dec 03 21:27:05 compute-0 systemd-sysv-generator[240479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:27:05 compute-0 systemd-rc-local-generator[240476]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:27:06 compute-0 sudo[240446]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v615: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:06 compute-0 sudo[240557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssgyjhxkugiimahbmvlpjbyzararrlkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797224.4154875-1489-193927313095374/AnsiballZ_systemd.py'
Dec 03 21:27:06 compute-0 sudo[240557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:06 compute-0 sshd-session[240151]: Connection reset by authenticating user root 45.140.17.124 port 53922 [preauth]
Dec 03 21:27:06 compute-0 python3.9[240559]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 03 21:27:06 compute-0 systemd[1]: Reloading.
Dec 03 21:27:06 compute-0 systemd-rc-local-generator[240591]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 03 21:27:06 compute-0 systemd-sysv-generator[240596]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 03 21:27:07 compute-0 ceph-mon[75204]: pgmap v615: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:07 compute-0 systemd[1]: Starting nova_compute container...
Dec 03 21:27:07 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:07 compute-0 podman[240603]: 2025-12-03 21:27:07.393622618 +0000 UTC m=+0.144115902 container init 6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 03 21:27:07 compute-0 podman[240603]: 2025-12-03 21:27:07.405907248 +0000 UTC m=+0.156400542 container start 6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:27:07 compute-0 podman[240603]: nova_compute
Dec 03 21:27:07 compute-0 nova_compute[240618]: + sudo -E kolla_set_configs
Dec 03 21:27:07 compute-0 systemd[1]: Started nova_compute container.
Dec 03 21:27:07 compute-0 sudo[240557]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Validating config file
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying service configuration files
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Deleting /etc/ceph
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Creating directory /etc/ceph
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/ceph
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Writing out command to execute
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:07 compute-0 nova_compute[240618]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 03 21:27:07 compute-0 nova_compute[240618]: ++ cat /run_command
Dec 03 21:27:07 compute-0 nova_compute[240618]: + CMD=nova-compute
Dec 03 21:27:07 compute-0 nova_compute[240618]: + ARGS=
Dec 03 21:27:07 compute-0 nova_compute[240618]: + sudo kolla_copy_cacerts
Dec 03 21:27:07 compute-0 nova_compute[240618]: + [[ ! -n '' ]]
Dec 03 21:27:07 compute-0 nova_compute[240618]: + . kolla_extend_start
Dec 03 21:27:07 compute-0 nova_compute[240618]: Running command: 'nova-compute'
Dec 03 21:27:07 compute-0 nova_compute[240618]: + echo 'Running command: '\''nova-compute'\'''
Dec 03 21:27:07 compute-0 nova_compute[240618]: + umask 0022
Dec 03 21:27:07 compute-0 nova_compute[240618]: + exec nova-compute
Dec 03 21:27:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v616: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:08 compute-0 python3.9[240779]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:27:08 compute-0 sshd-session[240560]: Connection reset by authenticating user root 45.140.17.124 port 53932 [preauth]
Dec 03 21:27:09 compute-0 ceph-mon[75204]: pgmap v616: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:09 compute-0 python3.9[240930]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:27:09 compute-0 nova_compute[240618]: 2025-12-03 21:27:09.658 240622 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 03 21:27:09 compute-0 nova_compute[240618]: 2025-12-03 21:27:09.658 240622 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 03 21:27:09 compute-0 nova_compute[240618]: 2025-12-03 21:27:09.658 240622 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 03 21:27:09 compute-0 nova_compute[240618]: 2025-12-03 21:27:09.659 240622 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 03 21:27:09 compute-0 nova_compute[240618]: 2025-12-03 21:27:09.785 240622 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:27:09 compute-0 nova_compute[240618]: 2025-12-03 21:27:09.814 240622 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:27:09 compute-0 nova_compute[240618]: 2025-12-03 21:27:09.815 240622 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 03 21:27:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v617: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:10 compute-0 python3.9[241086]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.491 240622 INFO nova.virt.driver [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.662 240622 INFO nova.compute.provider_config [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.683 240622 DEBUG oslo_concurrency.lockutils [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.684 240622 DEBUG oslo_concurrency.lockutils [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.684 240622 DEBUG oslo_concurrency.lockutils [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.684 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.684 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.685 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.685 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.685 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.685 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.685 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.685 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.685 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.686 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.686 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.686 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.686 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.686 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.686 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.686 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.687 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.687 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.687 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.687 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.687 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.687 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.687 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.688 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.688 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.688 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.688 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.688 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.688 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.688 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.689 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.690 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.690 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.690 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.690 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.690 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.690 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.691 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.691 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.691 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.691 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.691 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.691 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.691 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.692 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.692 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.692 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.692 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.692 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.692 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.692 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.693 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.694 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.695 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.695 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.695 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.695 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.695 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.695 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.696 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.697 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.698 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.698 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.698 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.698 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.698 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.698 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.698 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.699 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.700 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.700 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.700 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.700 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.700 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.700 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.700 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.701 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.702 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.702 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.702 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.702 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.702 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.702 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.702 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.703 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.704 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.704 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.704 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.704 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.704 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.704 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.704 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.705 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.705 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.705 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.705 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.705 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.705 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.705 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.706 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.706 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.706 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.706 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.706 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.706 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.706 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.707 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.707 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.707 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.707 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.707 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.708 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.709 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.709 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.709 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.709 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.709 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.709 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.709 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.710 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.710 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.710 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.710 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.710 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.710 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.710 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.711 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.711 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.711 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.711 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.711 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.711 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.711 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.712 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.713 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.713 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.713 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.713 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.713 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.713 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.713 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.714 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.714 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.714 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.714 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.714 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.714 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.714 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.715 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.715 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.715 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.715 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.715 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.715 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.715 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.716 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.717 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.717 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.717 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.717 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.717 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.717 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.717 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.718 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.718 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.718 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.718 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.718 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.718 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.718 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.719 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.719 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.719 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.719 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.719 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.719 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.719 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.720 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.720 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.720 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.720 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.720 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.720 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.720 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.721 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.722 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.722 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.722 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.722 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.722 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.722 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.722 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.723 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.723 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.723 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.723 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.723 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.723 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.723 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.724 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.724 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.724 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.724 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.724 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.724 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.724 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.725 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.725 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.725 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.725 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.725 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.725 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.725 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.726 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.726 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.726 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.726 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.726 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.726 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.726 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.727 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.727 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.727 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.727 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.727 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.727 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.727 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.728 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.728 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.728 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.728 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.728 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.728 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.728 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.729 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.729 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.729 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.729 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.729 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.729 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.729 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.730 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.730 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.730 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.730 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.730 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.730 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.730 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.731 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.731 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.731 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.731 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.731 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.731 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.731 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.732 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.732 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.732 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.732 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.732 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.732 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.732 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.733 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.734 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.734 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.734 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.734 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.734 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.735 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.735 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.735 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.735 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.735 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.735 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.735 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.736 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.737 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.737 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.737 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.737 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.737 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.737 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.737 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.738 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.738 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.738 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.738 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.738 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.738 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.738 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.739 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.739 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.739 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.739 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.739 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.739 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.739 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.740 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.740 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.740 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.740 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.740 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.740 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.740 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.741 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.741 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.741 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.741 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.741 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.741 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.741 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.742 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.742 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.742 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.742 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.742 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.742 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.742 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.743 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.743 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.743 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.743 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.743 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.743 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.743 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.744 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.744 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.744 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.744 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.744 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.744 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.745 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.745 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.745 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.745 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.745 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.745 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.745 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.746 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.746 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.746 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.746 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.746 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.746 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.746 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.747 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.747 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.747 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.747 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.747 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.747 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.747 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.748 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.748 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.748 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.748 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.748 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.748 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.748 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.749 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.749 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.749 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.749 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.749 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.749 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.749 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.750 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.750 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.750 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.750 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.750 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.750 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.751 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.751 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.751 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.751 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.751 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.751 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.751 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.752 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.752 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.752 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.752 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.752 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.752 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.752 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.753 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.753 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.753 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.753 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.753 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.753 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.753 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.754 240622 WARNING oslo_config.cfg [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 03 21:27:10 compute-0 nova_compute[240618]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 03 21:27:10 compute-0 nova_compute[240618]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 03 21:27:10 compute-0 nova_compute[240618]: and ``live_migration_inbound_addr`` respectively.
Dec 03 21:27:10 compute-0 nova_compute[240618]: ).  Its value may be silently ignored in the future.
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.754 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.754 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.754 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.754 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.754 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.755 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.755 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.755 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.755 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.755 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.755 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.755 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.756 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.756 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.756 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.756 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.756 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.756 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.756 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rbd_secret_uuid        = c21de27e-a7fd-594b-8324-0697ba9aab3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.757 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.757 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.757 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.757 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.757 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.757 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.757 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.758 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.758 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.758 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.758 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.758 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.758 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.758 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.759 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.759 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.759 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.759 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.759 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.759 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.760 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.760 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.760 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.760 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.760 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.760 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.760 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.761 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.761 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.761 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.761 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.761 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.761 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.761 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.762 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.763 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.763 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.763 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.763 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.763 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.763 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.763 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.764 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.764 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.764 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.764 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.764 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.764 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.764 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.765 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.766 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.766 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.766 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.766 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.766 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.766 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.766 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.767 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.767 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.767 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.767 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.767 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.767 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.767 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.768 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.769 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.769 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.769 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.769 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.769 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.769 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.769 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.770 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.771 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.771 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.771 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.771 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.771 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.771 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.771 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.772 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.772 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.772 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.772 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.772 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.772 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.772 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.773 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.773 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.773 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.773 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.773 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.773 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.773 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.774 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.774 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.774 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.774 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.774 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.774 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.774 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.775 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.775 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.775 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.775 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.775 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.775 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.775 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.776 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.776 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.776 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.776 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.776 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.776 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.777 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.777 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.777 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.777 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.777 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.777 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.777 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.778 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.778 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.778 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.778 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.778 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.778 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.778 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.779 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.779 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.779 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.779 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.779 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.779 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.779 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.780 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.780 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.780 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.780 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.780 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.780 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.780 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.781 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.781 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.781 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.781 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.781 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.781 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.781 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.782 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.782 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.782 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.782 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.782 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.782 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.782 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.783 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.783 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.783 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.783 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.783 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.783 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.783 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.784 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.785 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.785 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.785 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.785 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.785 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.785 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.785 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.786 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.787 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.787 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.787 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.787 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.787 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.787 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.787 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.788 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.789 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.789 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.789 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.789 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.789 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.789 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.790 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.790 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.790 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.790 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.790 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.790 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.790 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.791 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.792 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.792 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.792 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.792 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.792 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.792 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.792 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.793 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.793 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.793 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.793 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.793 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.793 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.793 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.794 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.794 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.794 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.794 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.794 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.794 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.794 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.795 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.795 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.795 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.795 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.795 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.795 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.795 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.796 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.796 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.796 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.796 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.796 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.796 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.796 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.797 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.798 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.798 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.798 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.798 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.798 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.798 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.798 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.799 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.799 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.799 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.799 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.799 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.799 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.799 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.800 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.800 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.800 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.800 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.800 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.800 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.800 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.801 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.801 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.801 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.801 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.801 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.801 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.801 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.802 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.802 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.802 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.802 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.802 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.802 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.802 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.803 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.804 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.804 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.804 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.804 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.804 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.804 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.804 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.805 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.806 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.806 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.806 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.806 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.806 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.806 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.806 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.807 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.807 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.807 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.807 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.807 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.807 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.807 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.808 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.809 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.809 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.809 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.809 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.809 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.810 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.810 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.810 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.810 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.810 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.810 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.810 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.811 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.811 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.811 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.811 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.811 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.811 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.811 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.812 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.812 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.812 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.812 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.812 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.812 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.812 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.813 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.813 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.813 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.813 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.813 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.813 240622 DEBUG oslo_service.service [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.814 240622 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.831 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.832 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.832 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.832 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 03 21:27:10 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 03 21:27:10 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.898 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f238353aaf0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.901 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f238353aaf0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.902 240622 INFO nova.virt.libvirt.driver [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Connection event '1' reason 'None'
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.926 240622 WARNING nova.virt.libvirt.driver [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 03 21:27:10 compute-0 nova_compute[240618]: 2025-12-03 21:27:10.926 240622 DEBUG nova.virt.libvirt.volume.mount [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 03 21:27:10 compute-0 sshd-session[240931]: Connection reset by authenticating user root 45.140.17.124 port 50044 [preauth]
Dec 03 21:27:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:11 compute-0 ceph-mon[75204]: pgmap v617: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:11 compute-0 sudo[241289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lglubgfsyhxiuemvbyniuzcxblpxmmvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797230.4587262-1549-168032724922556/AnsiballZ_podman_container.py'
Dec 03 21:27:11 compute-0 sudo[241289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:11 compute-0 python3.9[241291]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 03 21:27:11 compute-0 sudo[241289]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:11 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:27:11 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:27:11 compute-0 nova_compute[240618]: 2025-12-03 21:27:11.892 240622 INFO nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Libvirt host capabilities <capabilities>
Dec 03 21:27:11 compute-0 nova_compute[240618]: 
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <host>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <uuid>fe808748-0a27-4a3c-9875-a9777da5fa17</uuid>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <cpu>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <arch>x86_64</arch>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model>EPYC-Rome-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <vendor>AMD</vendor>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <microcode version='16777317'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <signature family='23' model='49' stepping='0'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='x2apic'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='tsc-deadline'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='osxsave'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='hypervisor'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='tsc_adjust'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='spec-ctrl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='stibp'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='arch-capabilities'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='ssbd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='cmp_legacy'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='topoext'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='virt-ssbd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='lbrv'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='tsc-scale'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='vmcb-clean'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='pause-filter'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='pfthreshold'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='svme-addr-chk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='rdctl-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='skip-l1dfl-vmentry'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='mds-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature name='pschange-mc-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <pages unit='KiB' size='4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <pages unit='KiB' size='2048'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <pages unit='KiB' size='1048576'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </cpu>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <power_management>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <suspend_mem/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </power_management>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <iommu support='no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <migration_features>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <live/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <uri_transports>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <uri_transport>tcp</uri_transport>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <uri_transport>rdma</uri_transport>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </uri_transports>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </migration_features>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <topology>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <cells num='1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <cell id='0'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           <memory unit='KiB'>7864316</memory>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           <pages unit='KiB' size='2048'>0</pages>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           <distances>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <sibling id='0' value='10'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           </distances>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           <cpus num='8'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:           </cpus>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         </cell>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </cells>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </topology>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <cache>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </cache>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <secmodel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model>selinux</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <doi>0</doi>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </secmodel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <secmodel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model>dac</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <doi>0</doi>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </secmodel>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   </host>
Dec 03 21:27:11 compute-0 nova_compute[240618]: 
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <guest>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <os_type>hvm</os_type>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <arch name='i686'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <wordsize>32</wordsize>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <domain type='qemu'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <domain type='kvm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </arch>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <features>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <pae/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <nonpae/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <acpi default='on' toggle='yes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <apic default='on' toggle='no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <cpuselection/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <deviceboot/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <disksnapshot default='on' toggle='no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <externalSnapshot/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </features>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   </guest>
Dec 03 21:27:11 compute-0 nova_compute[240618]: 
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <guest>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <os_type>hvm</os_type>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <arch name='x86_64'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <wordsize>64</wordsize>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <domain type='qemu'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <domain type='kvm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </arch>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <features>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <acpi default='on' toggle='yes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <apic default='on' toggle='no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <cpuselection/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <deviceboot/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <disksnapshot default='on' toggle='no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <externalSnapshot/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </features>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   </guest>
Dec 03 21:27:11 compute-0 nova_compute[240618]: 
Dec 03 21:27:11 compute-0 nova_compute[240618]: </capabilities>
Dec 03 21:27:11 compute-0 nova_compute[240618]: 
Dec 03 21:27:11 compute-0 nova_compute[240618]: 2025-12-03 21:27:11.904 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 03 21:27:11 compute-0 nova_compute[240618]: 2025-12-03 21:27:11.937 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 03 21:27:11 compute-0 nova_compute[240618]: <domainCapabilities>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <domain>kvm</domain>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <arch>i686</arch>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <vcpu max='4096'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <iothreads supported='yes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <os supported='yes'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <enum name='firmware'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <loader supported='yes'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>rom</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>pflash</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <enum name='readonly'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>yes</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <enum name='secure'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </loader>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   </os>
Dec 03 21:27:11 compute-0 nova_compute[240618]:   <cpu>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <enum name='maximumMigratable'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <vendor>AMD</vendor>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='succor'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:11 compute-0 nova_compute[240618]:     <mode name='custom' supported='yes'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cooperlake'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Denverton'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Denverton-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Denverton-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Denverton-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='EPYC-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx10'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx10-128'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx10-256'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx10-512'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Haswell-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='IvyBridge'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='KnightsMill'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='SierraForest'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Snowridge'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='athlon'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='athlon-v1'>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:11 compute-0 nova_compute[240618]:       <blockers model='core2duo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='core2duo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </cpu>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <memoryBacking supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <enum name='sourceType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>anonymous</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>memfd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </memoryBacking>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <disk supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='diskDevice'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>disk</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cdrom</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>floppy</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>lun</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>fdc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>sata</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </disk>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <graphics supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vnc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egl-headless</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </graphics>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <video supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='modelType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vga</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cirrus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>none</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>bochs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ramfb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </video>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hostdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='mode'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>subsystem</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='startupPolicy'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>mandatory</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>requisite</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>optional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='subsysType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pci</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='capsType'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='pciBackend'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hostdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <rng supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>random</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </rng>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <filesystem supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='driverType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>path</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>handle</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtiofs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </filesystem>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <tpm supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-tis</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-crb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emulator</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>external</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendVersion'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>2.0</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </tpm>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <redirdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </redirdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <channel supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </channel>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <crypto supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </crypto>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <interface supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>passt</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </interface>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <panic supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>isa</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>hyperv</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </panic>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <console supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>null</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dev</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pipe</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stdio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>udp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tcp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu-vdagent</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </console>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <features>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <gic supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <genid supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backup supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <async-teardown supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <ps2 supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sev supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sgx supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hyperv supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='features'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>relaxed</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vapic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>spinlocks</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vpindex</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>runtime</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>synic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stimer</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reset</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vendor_id</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>frequencies</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reenlightenment</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tlbflush</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ipi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>avic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emsr_bitmap</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>xmm_input</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hyperv>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <launchSecurity supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='sectype'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tdx</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </launchSecurity>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </features>
Dec 03 21:27:12 compute-0 nova_compute[240618]: </domainCapabilities>
Dec 03 21:27:12 compute-0 nova_compute[240618]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:11.946 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 03 21:27:12 compute-0 nova_compute[240618]: <domainCapabilities>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <domain>kvm</domain>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <arch>i686</arch>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <vcpu max='240'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <iothreads supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <os supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <enum name='firmware'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <loader supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>rom</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pflash</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='readonly'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>yes</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='secure'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </loader>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </os>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <cpu>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='maximumMigratable'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <vendor>AMD</vendor>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='succor'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='custom' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-128'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-256'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-512'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='KnightsMill'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v618: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SierraForest'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='athlon'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='athlon-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='core2duo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='core2duo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </cpu>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <memoryBacking supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <enum name='sourceType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>anonymous</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>memfd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </memoryBacking>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <disk supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='diskDevice'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>disk</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cdrom</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>floppy</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>lun</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ide</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>fdc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>sata</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </disk>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <graphics supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vnc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egl-headless</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </graphics>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <video supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='modelType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vga</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cirrus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>none</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>bochs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ramfb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </video>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hostdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='mode'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>subsystem</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='startupPolicy'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>mandatory</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>requisite</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>optional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='subsysType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pci</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='capsType'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='pciBackend'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hostdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <rng supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>random</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </rng>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <filesystem supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='driverType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>path</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>handle</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtiofs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </filesystem>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <tpm supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-tis</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-crb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emulator</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>external</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendVersion'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>2.0</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </tpm>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <redirdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </redirdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <channel supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </channel>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <crypto supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </crypto>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <interface supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>passt</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </interface>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <panic supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>isa</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>hyperv</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </panic>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <console supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>null</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dev</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pipe</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stdio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>udp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tcp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu-vdagent</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </console>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <features>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <gic supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <genid supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backup supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <async-teardown supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <ps2 supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sev supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sgx supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hyperv supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='features'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>relaxed</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vapic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>spinlocks</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vpindex</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>runtime</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>synic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stimer</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reset</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vendor_id</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>frequencies</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reenlightenment</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tlbflush</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ipi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>avic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emsr_bitmap</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>xmm_input</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hyperv>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <launchSecurity supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='sectype'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tdx</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </launchSecurity>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </features>
Dec 03 21:27:12 compute-0 nova_compute[240618]: </domainCapabilities>
Dec 03 21:27:12 compute-0 nova_compute[240618]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.000 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.006 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 03 21:27:12 compute-0 nova_compute[240618]: <domainCapabilities>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <domain>kvm</domain>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <arch>x86_64</arch>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <vcpu max='240'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <iothreads supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <os supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <enum name='firmware'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <loader supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>rom</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pflash</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='readonly'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>yes</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='secure'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </loader>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </os>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <cpu>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='maximumMigratable'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <vendor>AMD</vendor>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='succor'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='custom' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-128'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-256'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-512'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='KnightsMill'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SierraForest'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='athlon'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='athlon-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='core2duo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='core2duo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </cpu>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <memoryBacking supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <enum name='sourceType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>anonymous</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>memfd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </memoryBacking>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <disk supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='diskDevice'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>disk</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cdrom</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>floppy</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>lun</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ide</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>fdc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>sata</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </disk>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <graphics supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vnc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egl-headless</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </graphics>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <video supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='modelType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vga</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cirrus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>none</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>bochs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ramfb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </video>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hostdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='mode'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>subsystem</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='startupPolicy'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>mandatory</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>requisite</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>optional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='subsysType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pci</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='capsType'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='pciBackend'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hostdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <rng supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>random</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </rng>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <filesystem supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='driverType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>path</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>handle</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtiofs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </filesystem>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <tpm supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-tis</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-crb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emulator</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>external</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendVersion'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>2.0</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </tpm>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <redirdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </redirdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <channel supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </channel>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <crypto supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </crypto>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <interface supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>passt</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </interface>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <panic supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>isa</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>hyperv</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </panic>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <console supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>null</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dev</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pipe</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stdio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>udp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tcp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu-vdagent</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </console>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <features>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <gic supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <genid supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backup supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <async-teardown supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <ps2 supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sev supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sgx supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hyperv supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='features'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>relaxed</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vapic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>spinlocks</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vpindex</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>runtime</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>synic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stimer</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reset</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vendor_id</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>frequencies</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reenlightenment</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tlbflush</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ipi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>avic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emsr_bitmap</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>xmm_input</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hyperv>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <launchSecurity supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='sectype'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tdx</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </launchSecurity>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </features>
Dec 03 21:27:12 compute-0 nova_compute[240618]: </domainCapabilities>
Dec 03 21:27:12 compute-0 nova_compute[240618]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.068 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 03 21:27:12 compute-0 nova_compute[240618]: <domainCapabilities>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <domain>kvm</domain>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <arch>x86_64</arch>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <vcpu max='4096'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <iothreads supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <os supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <enum name='firmware'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>efi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <loader supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>rom</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pflash</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='readonly'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>yes</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='secure'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>yes</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>no</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </loader>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </os>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <cpu>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='maximumMigratable'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>on</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>off</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <vendor>AMD</vendor>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='succor'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <mode name='custom' supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Denverton-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='auto-ibrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amd-psfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='stibp-always-on'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='EPYC-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-128'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-256'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx10-512'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='prefetchiti'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Haswell-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='KnightsMill'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512er'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512pf'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fma4'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tbm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xop'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='amx-tile'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-bf16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-fp16'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bitalg'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrc'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fzrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='la57'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='taa-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xfd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SierraForest'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ifma'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cmpccxadd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fbsdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='fsrs'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ibrs-all'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mcdt-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pbrsb-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='psdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='serialize'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vaes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='hle'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='rtm'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512bw'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512cd'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512dq'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512f'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='avx512vl'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='invpcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pcid'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='pku'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='mpx'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='core-capability'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='split-lock-detect'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='cldemote'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='erms'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='gfni'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdir64b'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='movdiri'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='xsaves'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='athlon'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='athlon-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 sudo[241476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umpydszkvcamrkjvuizjcilymvxsiaac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797231.8246922-1557-264747370294791/AnsiballZ_systemd.py'
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='core2duo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='core2duo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='coreduo-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='n270-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='ss'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <blockers model='phenom-v1'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnow'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <feature name='3dnowext'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </blockers>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </mode>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </cpu>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <memoryBacking supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <enum name='sourceType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>anonymous</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <value>memfd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </enum>
Dec 03 21:27:12 compute-0 sudo[241476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </memoryBacking>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <disk supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='diskDevice'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>disk</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cdrom</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>floppy</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>lun</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>fdc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>sata</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </disk>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <graphics supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vnc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egl-headless</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </graphics>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <video supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='modelType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vga</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>cirrus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>none</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>bochs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ramfb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </video>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hostdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='mode'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>subsystem</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='startupPolicy'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>mandatory</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>requisite</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>optional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='subsysType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pci</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>scsi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='capsType'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='pciBackend'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hostdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <rng supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtio-non-transitional</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>random</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>egd</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </rng>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <filesystem supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='driverType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>path</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>handle</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>virtiofs</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </filesystem>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <tpm supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-tis</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tpm-crb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emulator</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>external</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendVersion'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>2.0</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </tpm>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <redirdev supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='bus'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>usb</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </redirdev>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <channel supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </channel>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <crypto supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendModel'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>builtin</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </crypto>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <interface supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='backendType'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>default</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>passt</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </interface>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <panic supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='model'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>isa</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>hyperv</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </panic>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <console supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='type'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>null</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vc</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pty</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dev</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>file</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>pipe</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stdio</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>udp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tcp</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>unix</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>qemu-vdagent</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>dbus</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </console>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </devices>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   <features>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <gic supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <genid supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <backup supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <async-teardown supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <ps2 supported='yes'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sev supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <sgx supported='no'/>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <hyperv supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='features'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>relaxed</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vapic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>spinlocks</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vpindex</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>runtime</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>synic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>stimer</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reset</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>vendor_id</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>frequencies</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>reenlightenment</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tlbflush</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>ipi</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>avic</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>emsr_bitmap</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>xmm_input</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </defaults>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </hyperv>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     <launchSecurity supported='yes'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       <enum name='sectype'>
Dec 03 21:27:12 compute-0 nova_compute[240618]:         <value>tdx</value>
Dec 03 21:27:12 compute-0 nova_compute[240618]:       </enum>
Dec 03 21:27:12 compute-0 nova_compute[240618]:     </launchSecurity>
Dec 03 21:27:12 compute-0 nova_compute[240618]:   </features>
Dec 03 21:27:12 compute-0 nova_compute[240618]: </domainCapabilities>
Dec 03 21:27:12 compute-0 nova_compute[240618]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.124 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.125 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.125 240622 DEBUG nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.125 240622 INFO nova.virt.libvirt.host [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Secure Boot support detected
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.128 240622 INFO nova.virt.libvirt.driver [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.128 240622 INFO nova.virt.libvirt.driver [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.144 240622 DEBUG nova.virt.libvirt.driver [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.185 240622 INFO nova.virt.node [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Determined node identity 94aba67c-5c5e-45d0-83d1-33eb467c8775 from /var/lib/nova/compute_id
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.204 240622 WARNING nova.compute.manager [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Compute nodes ['94aba67c-5c5e-45d0-83d1-33eb467c8775'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.247 240622 INFO nova.compute.manager [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.301 240622 WARNING nova.compute.manager [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.302 240622 DEBUG oslo_concurrency.lockutils [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.302 240622 DEBUG oslo_concurrency.lockutils [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.303 240622 DEBUG oslo_concurrency.lockutils [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.303 240622 DEBUG nova.compute.resource_tracker [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.304 240622 DEBUG oslo_concurrency.processutils [None req-a441af6e-ce83-45ed-80c3-d269004b9802 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:27:12 compute-0 python3.9[241478]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 03 21:27:12 compute-0 systemd[1]: Stopping nova_compute container...
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.677 240622 DEBUG oslo_concurrency.lockutils [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.678 240622 DEBUG oslo_concurrency.lockutils [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 03 21:27:12 compute-0 nova_compute[240618]: 2025-12-03 21:27:12.678 240622 DEBUG oslo_concurrency.lockutils [None req-1d878bf7-83de-4ce0-b7c1-ade1844721e4 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 03 21:27:13 compute-0 virtqemud[241184]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 03 21:27:13 compute-0 virtqemud[241184]: hostname: compute-0
Dec 03 21:27:13 compute-0 virtqemud[241184]: End of file while reading data: Input/output error
Dec 03 21:27:13 compute-0 systemd[1]: libpod-6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255.scope: Deactivated successfully.
Dec 03 21:27:13 compute-0 systemd[1]: libpod-6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255.scope: Consumed 3.544s CPU time.
Dec 03 21:27:13 compute-0 podman[241502]: 2025-12-03 21:27:13.045315798 +0000 UTC m=+0.418307317 container died 6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:27:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255-userdata-shm.mount: Deactivated successfully.
Dec 03 21:27:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be-merged.mount: Deactivated successfully.
Dec 03 21:27:13 compute-0 sshd-session[241276]: Connection reset by authenticating user root 45.140.17.124 port 50054 [preauth]
Dec 03 21:27:13 compute-0 ceph-mon[75204]: pgmap v618: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:13 compute-0 podman[241502]: 2025-12-03 21:27:13.891690833 +0000 UTC m=+1.264682362 container cleanup 6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:27:13 compute-0 podman[241502]: nova_compute
Dec 03 21:27:13 compute-0 podman[241537]: nova_compute
Dec 03 21:27:13 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 03 21:27:13 compute-0 systemd[1]: Stopped nova_compute container.
Dec 03 21:27:13 compute-0 systemd[1]: Starting nova_compute container...
Dec 03 21:27:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v619: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:14 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5354cb7424d4793461dd352f1c9ab1213e1a51e8f188e4aa61b41b02cde121be/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:14 compute-0 podman[241550]: 2025-12-03 21:27:14.149034467 +0000 UTC m=+0.133680682 container init 6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute)
Dec 03 21:27:14 compute-0 podman[241550]: 2025-12-03 21:27:14.161986185 +0000 UTC m=+0.146632410 container start 6df18d331f2ccf5e19017d42e753cb3908d6fceb137ed1521c71025bcc6b9255 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute)
Dec 03 21:27:14 compute-0 podman[241550]: nova_compute
Dec 03 21:27:14 compute-0 nova_compute[241566]: + sudo -E kolla_set_configs
Dec 03 21:27:14 compute-0 systemd[1]: Started nova_compute container.
Dec 03 21:27:14 compute-0 sudo[241476]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Validating config file
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying service configuration files
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /etc/ceph
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Creating directory /etc/ceph
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/ceph
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Writing out command to execute
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:14 compute-0 nova_compute[241566]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 03 21:27:14 compute-0 nova_compute[241566]: ++ cat /run_command
Dec 03 21:27:14 compute-0 nova_compute[241566]: + CMD=nova-compute
Dec 03 21:27:14 compute-0 nova_compute[241566]: + ARGS=
Dec 03 21:27:14 compute-0 nova_compute[241566]: + sudo kolla_copy_cacerts
Dec 03 21:27:14 compute-0 nova_compute[241566]: + [[ ! -n '' ]]
Dec 03 21:27:14 compute-0 nova_compute[241566]: + . kolla_extend_start
Dec 03 21:27:14 compute-0 nova_compute[241566]: Running command: 'nova-compute'
Dec 03 21:27:14 compute-0 nova_compute[241566]: + echo 'Running command: '\''nova-compute'\'''
Dec 03 21:27:14 compute-0 nova_compute[241566]: + umask 0022
Dec 03 21:27:14 compute-0 nova_compute[241566]: + exec nova-compute
Dec 03 21:27:14 compute-0 podman[241572]: 2025-12-03 21:27:14.359484485 +0000 UTC m=+0.146419283 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:27:14 compute-0 sudo[241752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewvxfwdkbawnimturbfhxikdxozgkicf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764797234.539059-1566-148337382814838/AnsiballZ_podman_container.py'
Dec 03 21:27:14 compute-0 sudo[241752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:27:15 compute-0 ceph-mon[75204]: pgmap v619: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:15 compute-0 python3.9[241754]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 03 21:27:15 compute-0 sshd-session[241530]: Connection reset by authenticating user root 45.140.17.124 port 50064 [preauth]
Dec 03 21:27:15 compute-0 systemd[1]: Started libpod-conmon-5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679.scope.
Dec 03 21:27:15 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b456d77c18c21606f98af99f9de0fc93246ff32666866c092acc1dee3c5deb7/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b456d77c18c21606f98af99f9de0fc93246ff32666866c092acc1dee3c5deb7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b456d77c18c21606f98af99f9de0fc93246ff32666866c092acc1dee3c5deb7/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:15 compute-0 podman[241781]: 2025-12-03 21:27:15.719112279 +0000 UTC m=+0.161616840 container init 5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec 03 21:27:15 compute-0 podman[241781]: 2025-12-03 21:27:15.729888648 +0000 UTC m=+0.172393179 container start 5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:27:15 compute-0 python3.9[241754]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Applying nova statedir ownership
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 03 21:27:15 compute-0 nova_compute_init[241802]: INFO:nova_statedir:Nova statedir ownership complete
Dec 03 21:27:15 compute-0 systemd[1]: libpod-5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679.scope: Deactivated successfully.
Dec 03 21:27:15 compute-0 podman[241803]: 2025-12-03 21:27:15.811446843 +0000 UTC m=+0.045539061 container died 5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 21:27:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679-userdata-shm.mount: Deactivated successfully.
Dec 03 21:27:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-5b456d77c18c21606f98af99f9de0fc93246ff32666866c092acc1dee3c5deb7-merged.mount: Deactivated successfully.
Dec 03 21:27:15 compute-0 podman[241814]: 2025-12-03 21:27:15.868533173 +0000 UTC m=+0.062550857 container cleanup 5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:27:15 compute-0 systemd[1]: libpod-conmon-5f2a8f92af126201e4bcea7b6c8994833207632e395a767ac859e71af7660679.scope: Deactivated successfully.
Dec 03 21:27:15 compute-0 sudo[241752]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v620: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:16 compute-0 sshd-session[211489]: Connection closed by 192.168.122.30 port 55546
Dec 03 21:27:16 compute-0 sshd-session[211486]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:27:16 compute-0 systemd[1]: session-49.scope: Deactivated successfully.
Dec 03 21:27:16 compute-0 systemd[1]: session-49.scope: Consumed 2min 41.092s CPU time.
Dec 03 21:27:16 compute-0 systemd-logind[787]: Session 49 logged out. Waiting for processes to exit.
Dec 03 21:27:16 compute-0 nova_compute[241566]: 2025-12-03 21:27:16.390 241570 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 03 21:27:16 compute-0 nova_compute[241566]: 2025-12-03 21:27:16.390 241570 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 03 21:27:16 compute-0 nova_compute[241566]: 2025-12-03 21:27:16.391 241570 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 03 21:27:16 compute-0 nova_compute[241566]: 2025-12-03 21:27:16.391 241570 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 03 21:27:16 compute-0 systemd-logind[787]: Removed session 49.
Dec 03 21:27:16 compute-0 nova_compute[241566]: 2025-12-03 21:27:16.528 241570 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:27:16 compute-0 nova_compute[241566]: 2025-12-03 21:27:16.540 241570 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:27:16 compute-0 nova_compute[241566]: 2025-12-03 21:27:16.541 241570 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.006 241570 INFO nova.virt.driver [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.132 241570 INFO nova.compute.provider_config [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.151 241570 DEBUG oslo_concurrency.lockutils [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.151 241570 DEBUG oslo_concurrency.lockutils [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.152 241570 DEBUG oslo_concurrency.lockutils [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.152 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.152 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.152 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.153 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.153 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.153 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.153 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.153 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.154 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.154 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.154 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.154 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.154 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.155 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.155 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.155 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.155 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.155 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.156 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.156 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.156 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.156 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.156 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.157 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.157 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.157 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.157 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.157 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.158 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.158 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.158 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.158 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.158 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.159 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.159 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.159 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.159 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.159 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.160 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.160 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.160 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.160 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.160 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.161 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.161 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.161 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.161 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.161 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.162 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.162 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.162 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.162 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.162 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.163 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.163 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.163 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.163 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.163 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.163 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.164 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.164 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.164 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.164 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.164 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.165 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.165 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.165 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.165 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.165 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.165 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.166 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.166 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.166 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.166 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.166 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.166 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.167 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.167 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.167 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.167 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.167 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.168 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.168 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.168 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.168 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.168 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.169 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.169 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.169 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.169 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.169 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.169 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.170 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.170 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.170 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.170 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.170 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.171 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.171 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.171 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.171 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.171 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.171 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.172 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.172 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.172 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.172 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.172 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.173 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.173 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.173 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.173 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.173 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.173 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.174 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.174 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.174 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.174 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.174 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.175 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.175 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.175 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.175 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.175 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.175 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.176 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.176 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.176 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.176 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.176 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.177 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.178 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.178 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.178 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.178 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.178 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.178 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.178 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.179 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.180 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.180 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.180 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.180 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.180 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.180 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.181 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.182 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.182 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.182 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.182 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.182 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.182 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.182 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.183 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.184 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.184 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.184 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.184 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.184 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.184 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.184 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.185 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.185 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.185 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.185 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.185 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.185 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.185 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.186 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.187 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.187 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.187 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.187 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.187 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.187 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.187 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.188 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.189 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.189 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.189 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.189 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.189 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.189 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.189 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.190 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.191 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.191 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.191 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.191 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.191 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.191 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.191 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.192 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.193 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.193 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.193 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.193 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.193 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.193 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.193 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.194 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.195 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.195 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.195 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.195 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.195 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.195 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.195 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.196 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.196 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.196 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.196 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.196 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.196 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.196 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.197 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.198 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.198 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.198 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.198 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.198 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.198 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.198 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.199 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.200 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.200 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.200 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.200 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.200 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.200 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.200 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.201 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.202 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.202 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.202 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.202 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.202 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.202 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.202 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.203 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.203 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.203 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.203 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.203 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.203 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.203 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.204 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.205 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.205 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.205 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.205 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.205 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.205 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.205 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.206 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.207 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.207 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.207 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.207 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.207 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.207 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.208 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.208 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.208 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.208 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.208 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.208 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.208 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.209 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.210 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.210 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.210 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.210 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.210 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.210 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.210 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.211 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.211 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.211 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.211 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.211 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.211 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.212 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.212 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.212 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.212 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.212 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.212 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.212 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.213 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.214 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.214 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.214 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.214 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.214 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.214 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.214 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.215 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.216 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.216 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.216 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.216 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.216 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.216 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.216 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.217 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.218 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.218 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.218 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.218 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.218 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.218 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.218 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.219 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.219 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.219 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.219 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.219 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.219 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.219 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.220 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.220 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.220 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.220 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.220 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 ceph-mon[75204]: pgmap v620: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.220 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.220 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.221 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.222 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.222 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.222 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.222 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.222 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.222 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.223 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.223 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.223 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.223 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.223 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.223 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.223 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.224 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.224 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.224 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.224 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.224 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.224 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.224 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.225 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.225 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.225 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.225 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.226 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.226 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.226 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.226 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.226 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.226 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.227 241570 WARNING oslo_config.cfg [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 03 21:27:17 compute-0 nova_compute[241566]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 03 21:27:17 compute-0 nova_compute[241566]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 03 21:27:17 compute-0 nova_compute[241566]: and ``live_migration_inbound_addr`` respectively.
Dec 03 21:27:17 compute-0 nova_compute[241566]: ).  Its value may be silently ignored in the future.
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.227 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.227 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.227 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.227 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.227 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.228 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.229 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.229 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.229 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.229 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.229 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.229 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rbd_secret_uuid        = c21de27e-a7fd-594b-8324-0697ba9aab3a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.229 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.230 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.230 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.230 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.230 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.230 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.230 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.230 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.231 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.231 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.231 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.231 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.231 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.231 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.231 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.232 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.232 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.232 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.232 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.232 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.232 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.232 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.233 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.234 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.234 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.234 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.234 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.234 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.234 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.234 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.235 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.236 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.236 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.236 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.236 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.236 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.236 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.236 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.237 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.237 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.237 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.237 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.237 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.237 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.237 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.238 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.239 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.239 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.239 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.239 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.239 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.239 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.239 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.240 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.241 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.241 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.241 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.241 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.241 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.241 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.241 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.242 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.243 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.244 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.244 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.244 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.244 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.244 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.244 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.244 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.245 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.246 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.246 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.246 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.246 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.246 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.246 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.247 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.247 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.247 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.247 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.247 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.247 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.247 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.248 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.248 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.248 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.248 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.248 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.248 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.248 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.249 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.250 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.250 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.250 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.250 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.250 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.250 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.250 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.251 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.251 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.251 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.251 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.251 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.251 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.251 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.252 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.252 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.252 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.252 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.252 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.252 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.252 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.253 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.254 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.254 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.254 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.254 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.254 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.254 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.254 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.255 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.255 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.255 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.255 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.255 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.255 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.256 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.256 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.256 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.256 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.256 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.256 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.256 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.257 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.257 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.257 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.257 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.257 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.257 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.257 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.258 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.258 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.258 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.258 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.258 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.258 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.258 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.259 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.260 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.260 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.260 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.260 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.260 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.260 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.261 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.261 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.261 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.261 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.261 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.261 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.262 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.262 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.262 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.262 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.262 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.262 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.262 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.263 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.263 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.263 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.263 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.263 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.263 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.263 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.264 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.264 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.264 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.264 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.264 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.264 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.264 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.265 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.265 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.265 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.265 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.265 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.265 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.265 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.266 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.266 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.266 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.266 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.266 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.266 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.266 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.267 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.267 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.267 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.267 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.267 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.267 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.268 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.268 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.268 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.268 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.268 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.268 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.268 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.269 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.269 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.269 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.269 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.269 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.269 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.269 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.270 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.270 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.270 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.270 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.270 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.270 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.271 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.271 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.271 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.271 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.271 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.271 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.272 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.272 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.272 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.272 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.272 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.272 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.273 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.273 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.273 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.273 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.273 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.273 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.273 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.274 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.274 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.274 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.274 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.274 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.274 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.275 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.275 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.275 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.275 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.275 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.275 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.275 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.276 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.276 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.276 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.276 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.276 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.276 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.276 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.277 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.278 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.278 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.278 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.278 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.278 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.278 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.278 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.279 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.279 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.279 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.279 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.279 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.279 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.279 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.280 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.280 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.280 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.280 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.280 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.280 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.280 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.281 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.281 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.281 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.281 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.281 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.281 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.281 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.282 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.282 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.282 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.282 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.282 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.282 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.282 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.283 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.283 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.283 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.283 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.283 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.283 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.283 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.284 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.284 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.284 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.284 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.284 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.284 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.285 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.286 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.286 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.286 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.286 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.286 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.286 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.286 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.287 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.287 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.287 241570 DEBUG oslo_service.service [None req-4eb91c96-5f1b-46df-abf8-c2655c7f26a2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.288 241570 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.306 241570 INFO nova.virt.node [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Determined node identity 94aba67c-5c5e-45d0-83d1-33eb467c8775 from /var/lib/nova/compute_id
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.307 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.307 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.308 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.308 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.318 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f3da91c41c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.320 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f3da91c41c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.320 241570 INFO nova.virt.libvirt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Connection event '1' reason 'None'
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.325 241570 INFO nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Libvirt host capabilities <capabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]: 
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <host>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <uuid>fe808748-0a27-4a3c-9875-a9777da5fa17</uuid>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <arch>x86_64</arch>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model>EPYC-Rome-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <vendor>AMD</vendor>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <microcode version='16777317'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <signature family='23' model='49' stepping='0'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='x2apic'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='tsc-deadline'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='osxsave'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='hypervisor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='tsc_adjust'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='spec-ctrl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='stibp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='arch-capabilities'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='cmp_legacy'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='topoext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='virt-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='lbrv'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='tsc-scale'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='vmcb-clean'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='pause-filter'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='pfthreshold'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='svme-addr-chk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='rdctl-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='skip-l1dfl-vmentry'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='mds-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature name='pschange-mc-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <pages unit='KiB' size='4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <pages unit='KiB' size='2048'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <pages unit='KiB' size='1048576'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <power_management>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <suspend_mem/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </power_management>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <iommu support='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <migration_features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <live/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <uri_transports>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <uri_transport>tcp</uri_transport>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <uri_transport>rdma</uri_transport>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </uri_transports>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </migration_features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <topology>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <cells num='1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <cell id='0'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           <memory unit='KiB'>7864316</memory>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           <pages unit='KiB' size='4'>1966079</pages>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           <pages unit='KiB' size='2048'>0</pages>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           <distances>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <sibling id='0' value='10'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           </distances>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           <cpus num='8'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:           </cpus>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         </cell>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </cells>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </topology>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <cache>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </cache>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <secmodel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model>selinux</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <doi>0</doi>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </secmodel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <secmodel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model>dac</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <doi>0</doi>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </secmodel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </host>
Dec 03 21:27:17 compute-0 nova_compute[241566]: 
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <guest>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <os_type>hvm</os_type>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <arch name='i686'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <wordsize>32</wordsize>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <domain type='qemu'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <domain type='kvm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </arch>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <pae/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <nonpae/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <acpi default='on' toggle='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <apic default='on' toggle='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <cpuselection/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <deviceboot/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <disksnapshot default='on' toggle='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <externalSnapshot/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </guest>
Dec 03 21:27:17 compute-0 nova_compute[241566]: 
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <guest>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <os_type>hvm</os_type>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <arch name='x86_64'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <wordsize>64</wordsize>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <domain type='qemu'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <domain type='kvm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </arch>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <acpi default='on' toggle='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <apic default='on' toggle='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <cpuselection/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <deviceboot/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <disksnapshot default='on' toggle='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <externalSnapshot/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </guest>
Dec 03 21:27:17 compute-0 nova_compute[241566]: 
Dec 03 21:27:17 compute-0 nova_compute[241566]: </capabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]: 
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.336 241570 DEBUG nova.virt.libvirt.volume.mount [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.337 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.340 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 03 21:27:17 compute-0 nova_compute[241566]: <domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <domain>kvm</domain>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <arch>i686</arch>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <vcpu max='4096'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <iothreads supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <os supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='firmware'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <loader supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>rom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pflash</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='readonly'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>yes</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='secure'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </loader>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </os>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='maximumMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <vendor>AMD</vendor>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='succor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='custom' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-128'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-256'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-512'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <memoryBacking supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='sourceType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>anonymous</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>memfd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </memoryBacking>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <disk supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='diskDevice'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>disk</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cdrom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>floppy</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>lun</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>fdc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>sata</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <graphics supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vnc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egl-headless</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </graphics>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <video supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='modelType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vga</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cirrus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>none</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>bochs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ramfb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </video>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hostdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='mode'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>subsystem</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='startupPolicy'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>mandatory</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>requisite</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>optional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='subsysType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pci</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='capsType'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='pciBackend'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hostdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <rng supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>random</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </rng>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <filesystem supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='driverType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>path</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>handle</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtiofs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </filesystem>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <tpm supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-tis</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-crb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emulator</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>external</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendVersion'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>2.0</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </tpm>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <redirdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </redirdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <channel supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </channel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <crypto supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </crypto>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <interface supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>passt</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </interface>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <panic supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>isa</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>hyperv</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </panic>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <console supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>null</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dev</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pipe</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stdio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>udp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tcp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu-vdagent</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </console>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <gic supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <genid supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backup supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <async-teardown supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <ps2 supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sev supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sgx supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hyperv supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='features'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>relaxed</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vapic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>spinlocks</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vpindex</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>runtime</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>synic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stimer</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reset</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vendor_id</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>frequencies</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reenlightenment</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tlbflush</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ipi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>avic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emsr_bitmap</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>xmm_input</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hyperv>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <launchSecurity supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='sectype'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tdx</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </launchSecurity>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </features>
Dec 03 21:27:17 compute-0 nova_compute[241566]: </domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.345 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 03 21:27:17 compute-0 nova_compute[241566]: <domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <domain>kvm</domain>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <arch>i686</arch>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <vcpu max='240'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <iothreads supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <os supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='firmware'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <loader supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>rom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pflash</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='readonly'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>yes</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='secure'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </loader>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </os>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='maximumMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <vendor>AMD</vendor>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='succor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='custom' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-128'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-256'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-512'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <memoryBacking supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='sourceType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>anonymous</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>memfd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </memoryBacking>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <disk supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='diskDevice'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>disk</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cdrom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>floppy</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>lun</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ide</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>fdc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>sata</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <graphics supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vnc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egl-headless</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </graphics>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <video supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='modelType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vga</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cirrus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>none</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>bochs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ramfb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </video>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hostdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='mode'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>subsystem</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='startupPolicy'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>mandatory</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>requisite</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>optional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='subsysType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pci</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='capsType'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='pciBackend'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hostdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <rng supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>random</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </rng>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <filesystem supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='driverType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>path</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>handle</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtiofs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </filesystem>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <tpm supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-tis</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-crb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emulator</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>external</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendVersion'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>2.0</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </tpm>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <redirdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </redirdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <channel supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </channel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <crypto supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </crypto>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <interface supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>passt</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </interface>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <panic supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>isa</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>hyperv</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </panic>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <console supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>null</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dev</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pipe</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stdio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>udp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tcp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu-vdagent</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </console>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <gic supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <genid supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backup supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <async-teardown supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <ps2 supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sev supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sgx supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hyperv supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='features'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>relaxed</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vapic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>spinlocks</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vpindex</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>runtime</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>synic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stimer</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reset</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vendor_id</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>frequencies</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reenlightenment</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tlbflush</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ipi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>avic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emsr_bitmap</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>xmm_input</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hyperv>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <launchSecurity supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='sectype'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tdx</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </launchSecurity>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </features>
Dec 03 21:27:17 compute-0 nova_compute[241566]: </domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.373 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.378 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 03 21:27:17 compute-0 nova_compute[241566]: <domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <domain>kvm</domain>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <arch>x86_64</arch>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <vcpu max='4096'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <iothreads supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <os supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='firmware'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>efi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <loader supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>rom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pflash</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='readonly'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>yes</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='secure'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>yes</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </loader>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </os>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='maximumMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <vendor>AMD</vendor>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='succor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='custom' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-128'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-256'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-512'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <memoryBacking supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='sourceType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>anonymous</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>memfd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </memoryBacking>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <disk supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='diskDevice'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>disk</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cdrom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>floppy</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>lun</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>fdc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>sata</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <graphics supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vnc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egl-headless</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </graphics>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <video supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='modelType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vga</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cirrus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>none</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>bochs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ramfb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </video>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hostdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='mode'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>subsystem</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='startupPolicy'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>mandatory</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>requisite</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>optional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='subsysType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pci</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='capsType'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='pciBackend'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hostdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <rng supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>random</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </rng>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <filesystem supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='driverType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>path</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>handle</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtiofs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </filesystem>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <tpm supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-tis</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-crb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emulator</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>external</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendVersion'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>2.0</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </tpm>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <redirdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </redirdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <channel supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </channel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <crypto supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </crypto>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <interface supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>passt</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </interface>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <panic supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>isa</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>hyperv</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </panic>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <console supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>null</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dev</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pipe</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stdio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>udp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tcp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu-vdagent</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </console>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <gic supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <genid supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backup supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <async-teardown supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <ps2 supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sev supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sgx supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hyperv supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='features'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>relaxed</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vapic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>spinlocks</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vpindex</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>runtime</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>synic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stimer</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reset</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vendor_id</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>frequencies</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reenlightenment</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tlbflush</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ipi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>avic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emsr_bitmap</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>xmm_input</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hyperv>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <launchSecurity supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='sectype'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tdx</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </launchSecurity>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </features>
Dec 03 21:27:17 compute-0 nova_compute[241566]: </domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.442 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 03 21:27:17 compute-0 nova_compute[241566]: <domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <path>/usr/libexec/qemu-kvm</path>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <domain>kvm</domain>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <arch>x86_64</arch>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <vcpu max='240'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <iothreads supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <os supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='firmware'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <loader supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>rom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pflash</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='readonly'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>yes</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='secure'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>no</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </loader>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </os>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-passthrough' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='hostPassthroughMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='maximum' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='maximumMigratable'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>on</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>off</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='host-model' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <vendor>AMD</vendor>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='x2apic'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-deadline'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='hypervisor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc_adjust'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='spec-ctrl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='stibp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='cmp_legacy'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='overflow-recov'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='succor'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='amd-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='virt-ssbd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lbrv'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='tsc-scale'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='vmcb-clean'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='flushbyasid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pause-filter'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='pfthreshold'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='svme-addr-chk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <feature policy='disable' name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <mode name='custom' supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Broadwell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cascadelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Cooperlake-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Denverton-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Dhyana-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Genoa-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='auto-ibrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Milan-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amd-psfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='no-nested-data-bp'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='null-sel-clr-base'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='stibp-always-on'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-Rome-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='EPYC-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='GraniteRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-128'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-256'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx10-512'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='prefetchiti'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Haswell-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-noTSX'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v6'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Icelake-Server-v7'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='IvyBridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='KnightsMill-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4fmaps'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-4vnniw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512er'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512pf'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G4-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Opteron_G5-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fma4'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tbm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xop'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SapphireRapids-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='amx-tile'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-bf16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-fp16'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512-vpopcntdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bitalg'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vbmi2'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrc'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fzrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='la57'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='taa-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='tsx-ldtrk'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xfd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='SierraForest-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ifma'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-ne-convert'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx-vnni-int8'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='bus-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cmpccxadd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fbsdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='fsrs'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ibrs-all'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mcdt-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pbrsb-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='psdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='sbdr-ssdp-no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='serialize'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vaes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='vpclmulqdq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Client-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='hle'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='rtm'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Skylake-Server-v5'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512bw'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512cd'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512dq'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512f'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='avx512vl'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='invpcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pcid'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='pku'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='mpx'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v2'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v3'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='core-capability'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='split-lock-detect'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='Snowridge-v4'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='cldemote'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='erms'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='gfni'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdir64b'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='movdiri'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='xsaves'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='athlon-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='core2duo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='coreduo-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='n270-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='ss'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <blockers model='phenom-v1'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnow'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <feature name='3dnowext'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </blockers>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </mode>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </cpu>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <memoryBacking supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <enum name='sourceType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>anonymous</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <value>memfd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </memoryBacking>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <disk supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='diskDevice'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>disk</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cdrom</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>floppy</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>lun</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ide</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>fdc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>sata</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <graphics supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vnc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egl-headless</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </graphics>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <video supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='modelType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vga</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>cirrus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>none</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>bochs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ramfb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </video>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hostdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='mode'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>subsystem</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='startupPolicy'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>mandatory</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>requisite</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>optional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='subsysType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pci</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>scsi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='capsType'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='pciBackend'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hostdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <rng supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtio-non-transitional</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>random</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>egd</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </rng>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <filesystem supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='driverType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>path</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>handle</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>virtiofs</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </filesystem>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <tpm supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-tis</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tpm-crb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emulator</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>external</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendVersion'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>2.0</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </tpm>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <redirdev supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='bus'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>usb</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </redirdev>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <channel supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </channel>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <crypto supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendModel'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>builtin</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </crypto>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <interface supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='backendType'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>default</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>passt</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </interface>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <panic supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='model'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>isa</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>hyperv</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </panic>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <console supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='type'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>null</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vc</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pty</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dev</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>file</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>pipe</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stdio</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>udp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tcp</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>unix</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>qemu-vdagent</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>dbus</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </console>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </devices>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   <features>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <gic supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <vmcoreinfo supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <genid supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backingStoreInput supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <backup supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <async-teardown supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <ps2 supported='yes'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sev supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <sgx supported='no'/>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <hyperv supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='features'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>relaxed</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vapic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>spinlocks</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vpindex</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>runtime</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>synic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>stimer</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reset</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>vendor_id</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>frequencies</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>reenlightenment</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tlbflush</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>ipi</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>avic</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>emsr_bitmap</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>xmm_input</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <spinlocks>4095</spinlocks>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <stimer_direct>on</stimer_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_direct>on</tlbflush_direct>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <tlbflush_extended>on</tlbflush_extended>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </defaults>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </hyperv>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     <launchSecurity supported='yes'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       <enum name='sectype'>
Dec 03 21:27:17 compute-0 nova_compute[241566]:         <value>tdx</value>
Dec 03 21:27:17 compute-0 nova_compute[241566]:       </enum>
Dec 03 21:27:17 compute-0 nova_compute[241566]:     </launchSecurity>
Dec 03 21:27:17 compute-0 nova_compute[241566]:   </features>
Dec 03 21:27:17 compute-0 nova_compute[241566]: </domainCapabilities>
Dec 03 21:27:17 compute-0 nova_compute[241566]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.506 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.506 241570 INFO nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Secure Boot support detected
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.508 241570 INFO nova.virt.libvirt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.508 241570 INFO nova.virt.libvirt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.517 241570 DEBUG nova.virt.libvirt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.539 241570 INFO nova.virt.node [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Determined node identity 94aba67c-5c5e-45d0-83d1-33eb467c8775 from /var/lib/nova/compute_id
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.554 241570 WARNING nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Compute nodes ['94aba67c-5c5e-45d0-83d1-33eb467c8775'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.579 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.608 241570 WARNING nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.608 241570 DEBUG oslo_concurrency.lockutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.609 241570 DEBUG oslo_concurrency.lockutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.609 241570 DEBUG oslo_concurrency.lockutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.609 241570 DEBUG nova.compute.resource_tracker [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:27:17 compute-0 nova_compute[241566]: 2025-12-03 21:27:17.609 241570 DEBUG oslo_concurrency.processutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:27:17 compute-0 rsyslogd[1006]: imjournal from <np0005544708:nova_compute>: begin to drop messages due to rate-limiting
Dec 03 21:27:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v621: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:27:18 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/167283569' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.109 241570 DEBUG oslo_concurrency.processutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:27:18 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 03 21:27:18 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 03 21:27:18 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/167283569' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.499 241570 WARNING nova.virt.libvirt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.501 241570 DEBUG nova.compute.resource_tracker [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5305MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.501 241570 DEBUG oslo_concurrency.lockutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.502 241570 DEBUG oslo_concurrency.lockutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.524 241570 WARNING nova.compute.resource_tracker [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] No compute node record for compute-0.ctlplane.example.com:94aba67c-5c5e-45d0-83d1-33eb467c8775: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 94aba67c-5c5e-45d0-83d1-33eb467c8775 could not be found.
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.545 241570 INFO nova.compute.resource_tracker [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 94aba67c-5c5e-45d0-83d1-33eb467c8775
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.631 241570 DEBUG nova.compute.resource_tracker [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:27:18 compute-0 nova_compute[241566]: 2025-12-03 21:27:18.631 241570 DEBUG nova.compute.resource_tracker [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:27:19 compute-0 ceph-mon[75204]: pgmap v621: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:19 compute-0 nova_compute[241566]: 2025-12-03 21:27:19.632 241570 INFO nova.scheduler.client.report [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [req-0d954d10-19c0-486b-918d-638cc6b61e2d] Created resource provider record via placement API for resource provider with UUID 94aba67c-5c5e-45d0-83d1-33eb467c8775 and name compute-0.ctlplane.example.com.
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.041 241570 DEBUG oslo_concurrency.processutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:27:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v622: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:27:20 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3279115878' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.572 241570 DEBUG oslo_concurrency.processutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.579 241570 DEBUG nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 03 21:27:20 compute-0 nova_compute[241566]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.580 241570 INFO nova.virt.libvirt.host [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] kernel doesn't support AMD SEV
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.580 241570 DEBUG nova.compute.provider_tree [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Updating inventory in ProviderTree for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.581 241570 DEBUG nova.virt.libvirt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.654 241570 DEBUG nova.scheduler.client.report [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Updated inventory for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.656 241570 DEBUG nova.compute.provider_tree [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Updating resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.656 241570 DEBUG nova.compute.provider_tree [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Updating inventory in ProviderTree for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.808 241570 DEBUG nova.compute.provider_tree [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Updating resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.841 241570 DEBUG nova.compute.resource_tracker [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.842 241570 DEBUG oslo_concurrency.lockutils [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.843 241570 DEBUG nova.service [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.962 241570 DEBUG nova.service [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 03 21:27:20 compute-0 nova_compute[241566]: 2025-12-03 21:27:20.963 241570 DEBUG nova.servicegroup.drivers.db [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 03 21:27:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:27:21
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['vms', 'cephfs.cephfs.meta', 'cephfs.cephfs.data', 'images', 'volumes', 'backups', '.mgr']
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:27:21 compute-0 ceph-mon[75204]: pgmap v622: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:21 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3279115878' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:27:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:27:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v623: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:23 compute-0 ceph-mon[75204]: pgmap v623: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v624: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:25 compute-0 ceph-mon[75204]: pgmap v624: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v625: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:27 compute-0 ceph-mon[75204]: pgmap v625: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:27:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:27:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v626: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:28 compute-0 podman[241955]: 2025-12-03 21:27:28.144198402 +0000 UTC m=+0.080175979 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 03 21:27:28 compute-0 podman[241956]: 2025-12-03 21:27:28.145811745 +0000 UTC m=+0.075326310 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 21:27:29 compute-0 ceph-mon[75204]: pgmap v626: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v627: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:31 compute-0 ceph-mon[75204]: pgmap v627: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v628: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:33 compute-0 ceph-mon[75204]: pgmap v628: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v629: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:35 compute-0 ceph-mon[75204]: pgmap v629: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v630: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:37 compute-0 ceph-mon[75204]: pgmap v630: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v631: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:39 compute-0 ceph-mon[75204]: pgmap v631: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v632: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:27:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2440275908' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:27:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2440275908' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:27:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4164745870' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:27:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4164745870' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2440275908' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2440275908' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/4164745870' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/4164745870' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:27:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2522042389' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:27:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:27:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2522042389' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:27:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:41 compute-0 ceph-mon[75204]: pgmap v632: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2522042389' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:27:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2522042389' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:27:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v633: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:43 compute-0 ceph-mon[75204]: pgmap v633: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v634: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:45 compute-0 podman[241995]: 2025-12-03 21:27:45.197886324 +0000 UTC m=+0.135401889 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:27:45 compute-0 ceph-mon[75204]: pgmap v634: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:45 compute-0 nova_compute[241566]: 2025-12-03 21:27:45.966 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:27:46 compute-0 nova_compute[241566]: 2025-12-03 21:27:46.032 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:27:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v635: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:47 compute-0 ceph-mon[75204]: pgmap v635: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:47 compute-0 sudo[242022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:27:47 compute-0 sudo[242022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:47 compute-0 sudo[242022]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:47 compute-0 sudo[242047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:27:47 compute-0 sudo[242047]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v636: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:48 compute-0 sudo[242047]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:27:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:27:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:27:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:27:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:27:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:27:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:27:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:27:48 compute-0 sudo[242103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:27:48 compute-0 sudo[242103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:48 compute-0 sudo[242103]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:27:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:27:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:27:48 compute-0 sudo[242128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:27:48 compute-0 sudo[242128]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:48 compute-0 podman[242165]: 2025-12-03 21:27:48.854518226 +0000 UTC m=+0.068483556 container create 58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_lovelace, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:27:48 compute-0 systemd[1]: Started libpod-conmon-58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f.scope.
Dec 03 21:27:48 compute-0 podman[242165]: 2025-12-03 21:27:48.82107419 +0000 UTC m=+0.035039590 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:27:48 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:27:48.931 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:27:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:27:48.931 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:27:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:27:48.931 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:27:48 compute-0 podman[242165]: 2025-12-03 21:27:48.942140793 +0000 UTC m=+0.156106133 container init 58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_lovelace, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:27:48 compute-0 podman[242165]: 2025-12-03 21:27:48.949442769 +0000 UTC m=+0.163408079 container start 58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_lovelace, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:27:48 compute-0 systemd[1]: libpod-58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f.scope: Deactivated successfully.
Dec 03 21:27:48 compute-0 heuristic_lovelace[242182]: 167 167
Dec 03 21:27:48 compute-0 conmon[242182]: conmon 58a70c924b236018e868 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f.scope/container/memory.events
Dec 03 21:27:48 compute-0 podman[242165]: 2025-12-03 21:27:48.989559333 +0000 UTC m=+0.203524664 container attach 58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_lovelace, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:27:48 compute-0 podman[242165]: 2025-12-03 21:27:48.991956498 +0000 UTC m=+0.205921848 container died 58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_lovelace, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:27:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-e85094be78aefbcb6596165ef701712c57ca4c7902f787eff45f465ff6d7d288-merged.mount: Deactivated successfully.
Dec 03 21:27:49 compute-0 podman[242165]: 2025-12-03 21:27:49.042245155 +0000 UTC m=+0.256210505 container remove 58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_lovelace, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:27:49 compute-0 systemd[1]: libpod-conmon-58a70c924b236018e86875c0e40d42c7057afa647bbbcb8fec56489e8e8d2e9f.scope: Deactivated successfully.
Dec 03 21:27:49 compute-0 podman[242207]: 2025-12-03 21:27:49.253363461 +0000 UTC m=+0.059597168 container create d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_grothendieck, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:27:49 compute-0 systemd[1]: Started libpod-conmon-d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25.scope.
Dec 03 21:27:49 compute-0 podman[242207]: 2025-12-03 21:27:49.224051846 +0000 UTC m=+0.030285613 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:27:49 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec015648c3d23ce025c00e02ff31eb6eca23e7afeac5eff696405a400ad37d28/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec015648c3d23ce025c00e02ff31eb6eca23e7afeac5eff696405a400ad37d28/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec015648c3d23ce025c00e02ff31eb6eca23e7afeac5eff696405a400ad37d28/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec015648c3d23ce025c00e02ff31eb6eca23e7afeac5eff696405a400ad37d28/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec015648c3d23ce025c00e02ff31eb6eca23e7afeac5eff696405a400ad37d28/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:49 compute-0 podman[242207]: 2025-12-03 21:27:49.352472966 +0000 UTC m=+0.158706683 container init d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_grothendieck, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:27:49 compute-0 podman[242207]: 2025-12-03 21:27:49.36793554 +0000 UTC m=+0.174169247 container start d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_grothendieck, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default)
Dec 03 21:27:49 compute-0 podman[242207]: 2025-12-03 21:27:49.37166933 +0000 UTC m=+0.177903037 container attach d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_grothendieck, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:27:49 compute-0 ceph-mon[75204]: pgmap v636: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:49 compute-0 compassionate_grothendieck[242224]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:27:49 compute-0 compassionate_grothendieck[242224]: --> All data devices are unavailable
Dec 03 21:27:49 compute-0 systemd[1]: libpod-d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25.scope: Deactivated successfully.
Dec 03 21:27:49 compute-0 podman[242244]: 2025-12-03 21:27:49.929110264 +0000 UTC m=+0.025460103 container died d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_grothendieck, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:27:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v637: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-ec015648c3d23ce025c00e02ff31eb6eca23e7afeac5eff696405a400ad37d28-merged.mount: Deactivated successfully.
Dec 03 21:27:50 compute-0 podman[242244]: 2025-12-03 21:27:50.175962727 +0000 UTC m=+0.272312566 container remove d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_grothendieck, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:27:50 compute-0 systemd[1]: libpod-conmon-d96c4cbcc4089b4d64fcee62c17bd90104aee0bafa736e0aa0fbd231030d5b25.scope: Deactivated successfully.
Dec 03 21:27:50 compute-0 sudo[242128]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:50 compute-0 sudo[242261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:27:50 compute-0 sudo[242261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:50 compute-0 sudo[242261]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:50 compute-0 sudo[242286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:27:50 compute-0 sudo[242286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:50 compute-0 podman[242322]: 2025-12-03 21:27:50.725124129 +0000 UTC m=+0.044870923 container create e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_shockley, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 03 21:27:50 compute-0 systemd[1]: Started libpod-conmon-e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe.scope.
Dec 03 21:27:50 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:50 compute-0 podman[242322]: 2025-12-03 21:27:50.706181263 +0000 UTC m=+0.025928057 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:27:50 compute-0 podman[242322]: 2025-12-03 21:27:50.808782401 +0000 UTC m=+0.128529245 container init e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_shockley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:27:50 compute-0 podman[242322]: 2025-12-03 21:27:50.821999245 +0000 UTC m=+0.141746039 container start e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_shockley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:27:50 compute-0 podman[242322]: 2025-12-03 21:27:50.825400116 +0000 UTC m=+0.145146910 container attach e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_shockley, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:27:50 compute-0 suspicious_shockley[242339]: 167 167
Dec 03 21:27:50 compute-0 systemd[1]: libpod-e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe.scope: Deactivated successfully.
Dec 03 21:27:50 compute-0 conmon[242339]: conmon e0df29ee4d6f26fdb74f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe.scope/container/memory.events
Dec 03 21:27:50 compute-0 podman[242322]: 2025-12-03 21:27:50.82814784 +0000 UTC m=+0.147894644 container died e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_shockley, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:27:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-a282e5f90dcf35d7813f428bf3b98eada23ba9f7b38a4299337c425edc8d6cdf-merged.mount: Deactivated successfully.
Dec 03 21:27:50 compute-0 podman[242322]: 2025-12-03 21:27:50.863351133 +0000 UTC m=+0.183097927 container remove e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=suspicious_shockley, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:27:50 compute-0 systemd[1]: libpod-conmon-e0df29ee4d6f26fdb74f09509940cd3d4688d184714cd6a007c6ce7d4483f6fe.scope: Deactivated successfully.
Dec 03 21:27:51 compute-0 podman[242362]: 2025-12-03 21:27:51.068715645 +0000 UTC m=+0.060540474 container create 263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_knuth, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:27:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:51 compute-0 systemd[1]: Started libpod-conmon-263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700.scope.
Dec 03 21:27:51 compute-0 podman[242362]: 2025-12-03 21:27:51.038390602 +0000 UTC m=+0.030215481 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:27:51 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7461cacf73c387f7e03822bd3e011d95ffc75428187337829c1a4f714da1eb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7461cacf73c387f7e03822bd3e011d95ffc75428187337829c1a4f714da1eb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7461cacf73c387f7e03822bd3e011d95ffc75428187337829c1a4f714da1eb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ab7461cacf73c387f7e03822bd3e011d95ffc75428187337829c1a4f714da1eb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:51 compute-0 podman[242362]: 2025-12-03 21:27:51.185766961 +0000 UTC m=+0.177591820 container init 263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_knuth, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:27:51 compute-0 podman[242362]: 2025-12-03 21:27:51.196750065 +0000 UTC m=+0.188574884 container start 263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_knuth, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:27:51 compute-0 podman[242362]: 2025-12-03 21:27:51.201042919 +0000 UTC m=+0.192867708 container attach 263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_knuth, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030)
Dec 03 21:27:51 compute-0 exciting_knuth[242379]: {
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:     "0": [
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:         {
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "devices": [
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "/dev/loop3"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             ],
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_name": "ceph_lv0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_size": "21470642176",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "name": "ceph_lv0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "tags": {
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cluster_name": "ceph",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.crush_device_class": "",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.encrypted": "0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.objectstore": "bluestore",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osd_id": "0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.type": "block",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.vdo": "0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.with_tpm": "0"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             },
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "type": "block",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "vg_name": "ceph_vg0"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:         }
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:     ],
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:     "1": [
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:         {
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "devices": [
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "/dev/loop4"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             ],
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_name": "ceph_lv1",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_size": "21470642176",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "name": "ceph_lv1",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "tags": {
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cluster_name": "ceph",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.crush_device_class": "",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.encrypted": "0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.objectstore": "bluestore",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osd_id": "1",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.type": "block",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.vdo": "0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.with_tpm": "0"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             },
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "type": "block",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "vg_name": "ceph_vg1"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:         }
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:     ],
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:     "2": [
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:         {
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "devices": [
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "/dev/loop5"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             ],
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_name": "ceph_lv2",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_size": "21470642176",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "name": "ceph_lv2",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "tags": {
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.cluster_name": "ceph",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.crush_device_class": "",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.encrypted": "0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.objectstore": "bluestore",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osd_id": "2",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.type": "block",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.vdo": "0",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:                 "ceph.with_tpm": "0"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             },
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "type": "block",
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:             "vg_name": "ceph_vg2"
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:         }
Dec 03 21:27:51 compute-0 exciting_knuth[242379]:     ]
Dec 03 21:27:51 compute-0 exciting_knuth[242379]: }
Dec 03 21:27:51 compute-0 systemd[1]: libpod-263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700.scope: Deactivated successfully.
Dec 03 21:27:51 compute-0 podman[242362]: 2025-12-03 21:27:51.589391494 +0000 UTC m=+0.581216273 container died 263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_knuth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:27:51 compute-0 ceph-mon[75204]: pgmap v637: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:27:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:27:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:27:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:27:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:27:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:27:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-ab7461cacf73c387f7e03822bd3e011d95ffc75428187337829c1a4f714da1eb-merged.mount: Deactivated successfully.
Dec 03 21:27:51 compute-0 podman[242362]: 2025-12-03 21:27:51.978202251 +0000 UTC m=+0.970027060 container remove 263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_knuth, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:27:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v638: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:52 compute-0 systemd[1]: libpod-conmon-263fe4cb7fbcdf17fe003f40faa10ea157c0c1e97a1c5a280e85c962536fc700.scope: Deactivated successfully.
Dec 03 21:27:52 compute-0 sudo[242286]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:52 compute-0 sudo[242401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:27:52 compute-0 sudo[242401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:52 compute-0 sudo[242401]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:52 compute-0 sudo[242426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:27:52 compute-0 sudo[242426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:52 compute-0 podman[242463]: 2025-12-03 21:27:52.586320442 +0000 UTC m=+0.029671066 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:27:53 compute-0 podman[242463]: 2025-12-03 21:27:53.061115871 +0000 UTC m=+0.504466435 container create 1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_bose, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:27:53 compute-0 ceph-mon[75204]: pgmap v638: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:53 compute-0 systemd[1]: Started libpod-conmon-1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3.scope.
Dec 03 21:27:53 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:53 compute-0 podman[242463]: 2025-12-03 21:27:53.529177591 +0000 UTC m=+0.972528145 container init 1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_bose, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:27:53 compute-0 podman[242463]: 2025-12-03 21:27:53.538911842 +0000 UTC m=+0.982262396 container start 1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_bose, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:27:53 compute-0 podman[242463]: 2025-12-03 21:27:53.542437586 +0000 UTC m=+0.985788150 container attach 1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_bose, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:27:53 compute-0 affectionate_bose[242479]: 167 167
Dec 03 21:27:53 compute-0 systemd[1]: libpod-1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3.scope: Deactivated successfully.
Dec 03 21:27:53 compute-0 podman[242463]: 2025-12-03 21:27:53.546969098 +0000 UTC m=+0.990319642 container died 1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_bose, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS)
Dec 03 21:27:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-743c617d13dfb2e948496e6445f5561087b75c4ab999398645e3660727f90881-merged.mount: Deactivated successfully.
Dec 03 21:27:53 compute-0 podman[242463]: 2025-12-03 21:27:53.595164969 +0000 UTC m=+1.038515523 container remove 1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_bose, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:27:53 compute-0 systemd[1]: libpod-conmon-1f091b8d5c1b0009830959b0df94234eac2fe58645fdfc79cc377baebf1144e3.scope: Deactivated successfully.
Dec 03 21:27:53 compute-0 podman[242503]: 2025-12-03 21:27:53.863750895 +0000 UTC m=+0.082020769 container create c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_murdock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:27:53 compute-0 systemd[1]: Started libpod-conmon-c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31.scope.
Dec 03 21:27:53 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:27:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eff1dfaa65ef2601c3b820557f9c1f06a80c6a503e0711de9a44b1c23bb5e571/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eff1dfaa65ef2601c3b820557f9c1f06a80c6a503e0711de9a44b1c23bb5e571/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eff1dfaa65ef2601c3b820557f9c1f06a80c6a503e0711de9a44b1c23bb5e571/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eff1dfaa65ef2601c3b820557f9c1f06a80c6a503e0711de9a44b1c23bb5e571/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:27:53 compute-0 podman[242503]: 2025-12-03 21:27:53.843539094 +0000 UTC m=+0.061808978 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:27:53 compute-0 podman[242503]: 2025-12-03 21:27:53.958983316 +0000 UTC m=+0.177253260 container init c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_murdock, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:27:53 compute-0 podman[242503]: 2025-12-03 21:27:53.972719264 +0000 UTC m=+0.190989108 container start c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_murdock, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:27:53 compute-0 podman[242503]: 2025-12-03 21:27:53.976125075 +0000 UTC m=+0.194395009 container attach c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_murdock, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:27:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v639: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:54 compute-0 rsyslogd[1006]: imjournal: 1738 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 03 21:27:54 compute-0 lvm[242601]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:27:54 compute-0 lvm[242599]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:27:54 compute-0 lvm[242601]: VG ceph_vg2 finished
Dec 03 21:27:54 compute-0 lvm[242599]: VG ceph_vg0 finished
Dec 03 21:27:54 compute-0 lvm[242600]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:27:54 compute-0 lvm[242600]: VG ceph_vg1 finished
Dec 03 21:27:54 compute-0 cranky_murdock[242520]: {}
Dec 03 21:27:54 compute-0 podman[242503]: 2025-12-03 21:27:54.865914402 +0000 UTC m=+1.084184256 container died c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_murdock, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:27:54 compute-0 systemd[1]: libpod-c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31.scope: Deactivated successfully.
Dec 03 21:27:54 compute-0 systemd[1]: libpod-c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31.scope: Consumed 1.519s CPU time.
Dec 03 21:27:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-eff1dfaa65ef2601c3b820557f9c1f06a80c6a503e0711de9a44b1c23bb5e571-merged.mount: Deactivated successfully.
Dec 03 21:27:54 compute-0 podman[242503]: 2025-12-03 21:27:54.922208061 +0000 UTC m=+1.140477955 container remove c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_murdock, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:27:54 compute-0 systemd[1]: libpod-conmon-c98d8b29547b82ad31345e2f04d599c48c0e2d46cc55814a8273e9c06cad9d31.scope: Deactivated successfully.
Dec 03 21:27:54 compute-0 sudo[242426]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:27:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:27:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:27:54 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:27:55 compute-0 sudo[242617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:27:55 compute-0 sudo[242617]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:27:55 compute-0 sudo[242617]: pam_unix(sudo:session): session closed for user root
Dec 03 21:27:55 compute-0 ceph-mon[75204]: pgmap v639: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:27:55 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:27:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v640: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:27:58 compute-0 ceph-mon[75204]: pgmap v640: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v641: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:59 compute-0 ceph-mon[75204]: pgmap v641: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:27:59 compute-0 podman[242643]: 2025-12-03 21:27:59.155044661 +0000 UTC m=+0.076948374 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:27:59 compute-0 podman[242642]: 2025-12-03 21:27:59.155296577 +0000 UTC m=+0.085235035 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:28:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v642: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:01 compute-0 ceph-mon[75204]: pgmap v642: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:28:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3095 writes, 13K keys, 3095 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.01 MB/s
                                           Cumulative WAL: 3095 writes, 3095 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1284 writes, 5579 keys, 1284 commit groups, 1.0 writes per commit group, ingest: 5.74 MB, 0.01 MB/s
                                           Interval WAL: 1284 writes, 1284 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    101.0      0.10              0.04         6    0.017       0      0       0.0       0.0
                                             L6      1/0    4.66 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.4    146.5    119.4      0.21              0.09         5    0.042     16K   2270       0.0       0.0
                                            Sum      1/0    4.66 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.4     98.3    113.4      0.31              0.12        11    0.028     16K   2270       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.5    120.5    122.8      0.16              0.06         6    0.026     10K   1497       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0    146.5    119.4      0.21              0.09         5    0.042     16K   2270       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    106.2      0.10              0.04         5    0.019       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     10.4      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.010, interval 0.004
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.03 GB write, 0.03 MB/s write, 0.03 GB read, 0.03 MB/s read, 0.3 seconds
                                           Interval compaction: 0.02 GB write, 0.03 MB/s write, 0.02 GB read, 0.03 MB/s read, 0.2 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56170d6e38d0#2 capacity: 308.00 MB usage: 1.32 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 7.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(90,1.17 MB,0.378814%) FilterBlock(12,54.30 KB,0.0172157%) IndexBlock(12,106.89 KB,0.0338914%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 03 21:28:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v643: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:03 compute-0 ceph-mon[75204]: pgmap v643: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v644: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:05 compute-0 ceph-mon[75204]: pgmap v644: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v645: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:07 compute-0 ceph-mon[75204]: pgmap v645: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v646: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:09 compute-0 ceph-mon[75204]: pgmap v646: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v647: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:11 compute-0 ceph-mon[75204]: pgmap v647: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v648: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:13 compute-0 ceph-mon[75204]: pgmap v648: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v649: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:15 compute-0 ceph-mon[75204]: pgmap v649: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v650: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:16 compute-0 podman[242682]: 2025-12-03 21:28:16.345968416 +0000 UTC m=+0.096019923 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.554 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.555 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.556 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.556 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.618 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.619 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.619 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.620 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.620 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.620 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.620 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.621 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.621 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.649 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.650 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.650 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.650 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:28:16 compute-0 nova_compute[241566]: 2025-12-03 21:28:16.651 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:28:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:28:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1296139348' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:28:17 compute-0 nova_compute[241566]: 2025-12-03 21:28:17.192 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:28:17 compute-0 ceph-mon[75204]: pgmap v650: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:17 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1296139348' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:28:17 compute-0 nova_compute[241566]: 2025-12-03 21:28:17.375 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:28:17 compute-0 nova_compute[241566]: 2025-12-03 21:28:17.376 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5292MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:28:17 compute-0 nova_compute[241566]: 2025-12-03 21:28:17.376 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:28:17 compute-0 nova_compute[241566]: 2025-12-03 21:28:17.376 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:28:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec 03 21:28:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4130373903' entity='client.openstack' cmd={"prefix": "version", "format": "json"} : dispatch
Dec 03 21:28:17 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14308 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 03 21:28:17 compute-0 ceph-mgr[75500]: [volumes INFO volumes.module] Starting _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 03 21:28:17 compute-0 ceph-mgr[75500]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 03 21:28:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v651: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:18 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/4130373903' entity='client.openstack' cmd={"prefix": "version", "format": "json"} : dispatch
Dec 03 21:28:18 compute-0 ceph-mon[75204]: from='client.14308 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 03 21:28:19 compute-0 ceph-mon[75204]: pgmap v651: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:19 compute-0 nova_compute[241566]: 2025-12-03 21:28:19.627 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:28:19 compute-0 nova_compute[241566]: 2025-12-03 21:28:19.627 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:28:19 compute-0 nova_compute[241566]: 2025-12-03 21:28:19.654 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:28:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v652: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:28:20 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2724371676' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:28:20 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2724371676' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:28:20 compute-0 nova_compute[241566]: 2025-12-03 21:28:20.256 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.602s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:28:20 compute-0 nova_compute[241566]: 2025-12-03 21:28:20.264 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:28:20 compute-0 nova_compute[241566]: 2025-12-03 21:28:20.279 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:28:20 compute-0 nova_compute[241566]: 2025-12-03 21:28:20.281 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:28:20 compute-0 nova_compute[241566]: 2025-12-03 21:28:20.281 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:28:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:28:21
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.data', '.mgr', 'images', 'backups', 'vms', 'cephfs.cephfs.meta', 'volumes']
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:28:21 compute-0 ceph-mon[75204]: pgmap v652: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:28:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:28:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v653: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:23 compute-0 ceph-mon[75204]: pgmap v653: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v654: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:25 compute-0 ceph-mon[75204]: pgmap v654: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v655: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:27 compute-0 ceph-mon[75204]: pgmap v655: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:28:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:28:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v656: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:29 compute-0 ceph-mon[75204]: pgmap v656: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v657: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:30 compute-0 podman[242753]: 2025-12-03 21:28:30.110727727 +0000 UTC m=+0.049182869 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 03 21:28:30 compute-0 podman[242752]: 2025-12-03 21:28:30.157642623 +0000 UTC m=+0.097636676 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 03 21:28:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:31 compute-0 ceph-mon[75204]: pgmap v657: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v658: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:33 compute-0 ceph-mon[75204]: pgmap v658: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec 03 21:28:33 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1939712706' entity='client.openstack' cmd={"prefix": "version", "format": "json"} : dispatch
Dec 03 21:28:33 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14316 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 03 21:28:33 compute-0 ceph-mgr[75500]: [volumes INFO volumes.module] Starting _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 03 21:28:33 compute-0 ceph-mgr[75500]: [volumes INFO volumes.module] Finishing _cmd_fs_volume_ls(format:json, prefix:fs volume ls) < ""
Dec 03 21:28:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v659: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:34 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1939712706' entity='client.openstack' cmd={"prefix": "version", "format": "json"} : dispatch
Dec 03 21:28:34 compute-0 ceph-mon[75204]: from='client.14316 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 03 21:28:35 compute-0 ceph-mon[75204]: pgmap v659: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v660: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:37 compute-0 ceph-mon[75204]: pgmap v660: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v661: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:39 compute-0 ceph-mon[75204]: pgmap v661: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v662: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:41 compute-0 ceph-mon[75204]: pgmap v662: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v663: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:43 compute-0 ceph-mon[75204]: pgmap v663: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v664: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:45 compute-0 ceph-mon[75204]: pgmap v664: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v665: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:47 compute-0 podman[242789]: 2025-12-03 21:28:47.192766164 +0000 UTC m=+0.125235135 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 03 21:28:47 compute-0 ceph-mon[75204]: pgmap v665: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v666: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:28:48.931 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:28:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:28:48.932 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:28:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:28:48.932 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:28:49 compute-0 ceph-mon[75204]: pgmap v666: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v667: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:51 compute-0 ceph-mon[75204]: pgmap v667: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:28:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:28:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:28:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:28:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:28:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:28:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v668: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:53 compute-0 ceph-mon[75204]: pgmap v668: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v669: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:55 compute-0 sudo[242815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:28:55 compute-0 sudo[242815]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:55 compute-0 sudo[242815]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:55 compute-0 sudo[242840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 check-host
Dec 03 21:28:55 compute-0 sudo[242840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:55 compute-0 ceph-mon[75204]: pgmap v669: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:55 compute-0 sudo[242840]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:28:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:28:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:28:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:28:55 compute-0 sudo[242885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:28:55 compute-0 sudo[242885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:55 compute-0 sudo[242885]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:55 compute-0 sudo[242910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:28:55 compute-0 sudo[242910]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:28:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v670: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:56 compute-0 sudo[242910]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:28:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:28:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:28:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:28:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:28:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:28:56 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:28:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:28:56 compute-0 sudo[242966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:28:56 compute-0 sudo[242966]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:56 compute-0 sudo[242966]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:28:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:28:56 compute-0 sudo[242991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:28:56 compute-0 sudo[242991]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:57 compute-0 podman[243027]: 2025-12-03 21:28:57.033659261 +0000 UTC m=+0.053819333 container create 0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=priceless_darwin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:28:57 compute-0 systemd[1]: Started libpod-conmon-0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f.scope.
Dec 03 21:28:57 compute-0 podman[243027]: 2025-12-03 21:28:57.009923775 +0000 UTC m=+0.030083927 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:28:57 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:28:57 compute-0 podman[243027]: 2025-12-03 21:28:57.1221112 +0000 UTC m=+0.142271332 container init 0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=priceless_darwin, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:28:57 compute-0 podman[243027]: 2025-12-03 21:28:57.128700216 +0000 UTC m=+0.148860288 container start 0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=priceless_darwin, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:28:57 compute-0 podman[243027]: 2025-12-03 21:28:57.131503421 +0000 UTC m=+0.151663523 container attach 0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=priceless_darwin, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:28:57 compute-0 priceless_darwin[243043]: 167 167
Dec 03 21:28:57 compute-0 systemd[1]: libpod-0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f.scope: Deactivated successfully.
Dec 03 21:28:57 compute-0 podman[243027]: 2025-12-03 21:28:57.137367518 +0000 UTC m=+0.157527610 container died 0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=priceless_darwin, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:28:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-f5f6358899f3133da8a5343efdcd5b64b22c5914666a4c0165208804cb718252-merged.mount: Deactivated successfully.
Dec 03 21:28:57 compute-0 podman[243027]: 2025-12-03 21:28:57.183737601 +0000 UTC m=+0.203897703 container remove 0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=priceless_darwin, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:28:57 compute-0 systemd[1]: libpod-conmon-0ba070095caf0c36343510392a9548b9f14b271bb56c227e0fa1f30ce1be5c1f.scope: Deactivated successfully.
Dec 03 21:28:57 compute-0 podman[243067]: 2025-12-03 21:28:57.385857914 +0000 UTC m=+0.060552343 container create f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_jones, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 21:28:57 compute-0 systemd[1]: Started libpod-conmon-f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a.scope.
Dec 03 21:28:57 compute-0 podman[243067]: 2025-12-03 21:28:57.364348388 +0000 UTC m=+0.039042817 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:28:57 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:28:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3ff4585d0ecaf8af235aeeb9fc1b6aff6fa13966aca0beca0c0d90713a34a64/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3ff4585d0ecaf8af235aeeb9fc1b6aff6fa13966aca0beca0c0d90713a34a64/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3ff4585d0ecaf8af235aeeb9fc1b6aff6fa13966aca0beca0c0d90713a34a64/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3ff4585d0ecaf8af235aeeb9fc1b6aff6fa13966aca0beca0c0d90713a34a64/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3ff4585d0ecaf8af235aeeb9fc1b6aff6fa13966aca0beca0c0d90713a34a64/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:57 compute-0 podman[243067]: 2025-12-03 21:28:57.501910712 +0000 UTC m=+0.176605151 container init f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_jones, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:28:57 compute-0 podman[243067]: 2025-12-03 21:28:57.518420114 +0000 UTC m=+0.193114513 container start f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_jones, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3)
Dec 03 21:28:57 compute-0 podman[243067]: 2025-12-03 21:28:57.521946389 +0000 UTC m=+0.196640828 container attach f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_jones, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:28:57 compute-0 ceph-mon[75204]: pgmap v670: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:58 compute-0 reverent_jones[243083]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:28:58 compute-0 reverent_jones[243083]: --> All data devices are unavailable
Dec 03 21:28:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v671: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:58 compute-0 systemd[1]: libpod-f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a.scope: Deactivated successfully.
Dec 03 21:28:58 compute-0 podman[243067]: 2025-12-03 21:28:58.111856829 +0000 UTC m=+0.786551228 container died f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_jones, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030)
Dec 03 21:28:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-b3ff4585d0ecaf8af235aeeb9fc1b6aff6fa13966aca0beca0c0d90713a34a64-merged.mount: Deactivated successfully.
Dec 03 21:28:58 compute-0 podman[243067]: 2025-12-03 21:28:58.153868274 +0000 UTC m=+0.828562703 container remove f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_jones, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030)
Dec 03 21:28:58 compute-0 systemd[1]: libpod-conmon-f4c179b384f6ddd0d30417087335825261bf5d8f76e486f5b129ef7409c1c83a.scope: Deactivated successfully.
Dec 03 21:28:58 compute-0 sudo[242991]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:58 compute-0 sudo[243114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:28:58 compute-0 sudo[243114]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:58 compute-0 sudo[243114]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:58 compute-0 sudo[243139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:28:58 compute-0 sudo[243139]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:58 compute-0 podman[243176]: 2025-12-03 21:28:58.734907247 +0000 UTC m=+0.051284605 container create f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_williams, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:28:58 compute-0 systemd[1]: Started libpod-conmon-f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437.scope.
Dec 03 21:28:58 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:28:58 compute-0 podman[243176]: 2025-12-03 21:28:58.713784711 +0000 UTC m=+0.030162159 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:28:58 compute-0 podman[243176]: 2025-12-03 21:28:58.813199184 +0000 UTC m=+0.129576572 container init f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_williams, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle)
Dec 03 21:28:58 compute-0 podman[243176]: 2025-12-03 21:28:58.819103012 +0000 UTC m=+0.135480410 container start f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_williams, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:28:58 compute-0 podman[243176]: 2025-12-03 21:28:58.822704199 +0000 UTC m=+0.139081577 container attach f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_williams, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle)
Dec 03 21:28:58 compute-0 reverent_williams[243193]: 167 167
Dec 03 21:28:58 compute-0 systemd[1]: libpod-f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437.scope: Deactivated successfully.
Dec 03 21:28:58 compute-0 podman[243176]: 2025-12-03 21:28:58.824054834 +0000 UTC m=+0.140432202 container died f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_williams, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 03 21:28:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-e95bc63dd77a00cc5dd70f99ff5146ac94f111e9eb36c87415e9c768b1a19609-merged.mount: Deactivated successfully.
Dec 03 21:28:58 compute-0 podman[243176]: 2025-12-03 21:28:58.865218827 +0000 UTC m=+0.181596215 container remove f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_williams, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:28:58 compute-0 systemd[1]: libpod-conmon-f605237b63cd2c4c476ff530437580d1b767e2e66409a491957500e2fb0a1437.scope: Deactivated successfully.
Dec 03 21:28:59 compute-0 podman[243218]: 2025-12-03 21:28:59.07852688 +0000 UTC m=+0.065198277 container create c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_kirch, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:28:59 compute-0 systemd[1]: Started libpod-conmon-c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500.scope.
Dec 03 21:28:59 compute-0 podman[243218]: 2025-12-03 21:28:59.057395415 +0000 UTC m=+0.044066882 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:28:59 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:28:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad62fbc2ac4f6ee6f75eecdd2a1f844f5f86fffa303a9ec100b432355535089b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad62fbc2ac4f6ee6f75eecdd2a1f844f5f86fffa303a9ec100b432355535089b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad62fbc2ac4f6ee6f75eecdd2a1f844f5f86fffa303a9ec100b432355535089b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad62fbc2ac4f6ee6f75eecdd2a1f844f5f86fffa303a9ec100b432355535089b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:28:59 compute-0 podman[243218]: 2025-12-03 21:28:59.192153484 +0000 UTC m=+0.178824921 container init c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:28:59 compute-0 podman[243218]: 2025-12-03 21:28:59.20616963 +0000 UTC m=+0.192840997 container start c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:28:59 compute-0 podman[243218]: 2025-12-03 21:28:59.20881754 +0000 UTC m=+0.195489017 container attach c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_kirch, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]: {
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:     "0": [
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:         {
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "devices": [
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "/dev/loop3"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             ],
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_name": "ceph_lv0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_size": "21470642176",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "name": "ceph_lv0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "tags": {
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cluster_name": "ceph",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.crush_device_class": "",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.encrypted": "0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.objectstore": "bluestore",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osd_id": "0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.type": "block",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.vdo": "0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.with_tpm": "0"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             },
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "type": "block",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "vg_name": "ceph_vg0"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:         }
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:     ],
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:     "1": [
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:         {
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "devices": [
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "/dev/loop4"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             ],
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_name": "ceph_lv1",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_size": "21470642176",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "name": "ceph_lv1",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "tags": {
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cluster_name": "ceph",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.crush_device_class": "",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.encrypted": "0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.objectstore": "bluestore",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osd_id": "1",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.type": "block",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.vdo": "0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.with_tpm": "0"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             },
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "type": "block",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "vg_name": "ceph_vg1"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:         }
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:     ],
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:     "2": [
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:         {
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "devices": [
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "/dev/loop5"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             ],
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_name": "ceph_lv2",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_size": "21470642176",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "name": "ceph_lv2",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "tags": {
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.cluster_name": "ceph",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.crush_device_class": "",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.encrypted": "0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.objectstore": "bluestore",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osd_id": "2",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.type": "block",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.vdo": "0",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:                 "ceph.with_tpm": "0"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             },
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "type": "block",
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:             "vg_name": "ceph_vg2"
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:         }
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]:     ]
Dec 03 21:28:59 compute-0 vibrant_kirch[243235]: }
Dec 03 21:28:59 compute-0 systemd[1]: libpod-c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500.scope: Deactivated successfully.
Dec 03 21:28:59 compute-0 podman[243218]: 2025-12-03 21:28:59.583214908 +0000 UTC m=+0.569886275 container died c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_kirch, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:28:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-ad62fbc2ac4f6ee6f75eecdd2a1f844f5f86fffa303a9ec100b432355535089b-merged.mount: Deactivated successfully.
Dec 03 21:28:59 compute-0 podman[243218]: 2025-12-03 21:28:59.620915368 +0000 UTC m=+0.607586735 container remove c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_kirch, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle)
Dec 03 21:28:59 compute-0 systemd[1]: libpod-conmon-c1be0ff08d1033a6ef8da9aa1db1c1ef507c4f15d04b3d5261b5c9cfd8812500.scope: Deactivated successfully.
Dec 03 21:28:59 compute-0 sudo[243139]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:59 compute-0 ceph-mon[75204]: pgmap v671: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:28:59 compute-0 sudo[243255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:28:59 compute-0 sudo[243255]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:28:59 compute-0 sudo[243255]: pam_unix(sudo:session): session closed for user root
Dec 03 21:28:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:28:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3654616200' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:28:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:28:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3654616200' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:28:59 compute-0 sudo[243280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:28:59 compute-0 sudo[243280]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:29:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v672: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:00 compute-0 podman[243316]: 2025-12-03 21:29:00.1825469 +0000 UTC m=+0.062120904 container create 63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_solomon, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:29:00 compute-0 systemd[1]: Started libpod-conmon-63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca.scope.
Dec 03 21:29:00 compute-0 podman[243316]: 2025-12-03 21:29:00.157625703 +0000 UTC m=+0.037199767 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:29:00 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:29:00 compute-0 podman[243316]: 2025-12-03 21:29:00.287936323 +0000 UTC m=+0.167510407 container init 63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_solomon, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:29:00 compute-0 podman[243316]: 2025-12-03 21:29:00.30238481 +0000 UTC m=+0.181958794 container start 63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_solomon, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:29:00 compute-0 sweet_solomon[243335]: 167 167
Dec 03 21:29:00 compute-0 podman[243316]: 2025-12-03 21:29:00.308163385 +0000 UTC m=+0.187737369 container attach 63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_solomon, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:29:00 compute-0 podman[243316]: 2025-12-03 21:29:00.319133369 +0000 UTC m=+0.198707383 container died 63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_solomon, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:29:00 compute-0 systemd[1]: libpod-63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca.scope: Deactivated successfully.
Dec 03 21:29:00 compute-0 podman[243334]: 2025-12-03 21:29:00.344836497 +0000 UTC m=+0.105148687 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 03 21:29:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-82ab0babd6da154e361790e9cdd08d422229631dd6566ca3a062d61f5d3d76bd-merged.mount: Deactivated successfully.
Dec 03 21:29:00 compute-0 podman[243316]: 2025-12-03 21:29:00.370743342 +0000 UTC m=+0.250317316 container remove 63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sweet_solomon, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:29:00 compute-0 podman[243330]: 2025-12-03 21:29:00.373544657 +0000 UTC m=+0.136309683 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:29:00 compute-0 systemd[1]: libpod-conmon-63e56dfd2e059a894c93073544766848bc39d1edbaa3b580f99b2ee91e7403ca.scope: Deactivated successfully.
Dec 03 21:29:00 compute-0 podman[243389]: 2025-12-03 21:29:00.536439039 +0000 UTC m=+0.048203372 container create 59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_darwin, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:29:00 compute-0 systemd[1]: Started libpod-conmon-59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d.scope.
Dec 03 21:29:00 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:29:00 compute-0 podman[243389]: 2025-12-03 21:29:00.516926757 +0000 UTC m=+0.028691110 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:29:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1944ae2c67b4d552ae65ec1b33c08d997756e58e4d568629eb8516c60448bb7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:29:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1944ae2c67b4d552ae65ec1b33c08d997756e58e4d568629eb8516c60448bb7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:29:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1944ae2c67b4d552ae65ec1b33c08d997756e58e4d568629eb8516c60448bb7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:29:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b1944ae2c67b4d552ae65ec1b33c08d997756e58e4d568629eb8516c60448bb7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:29:00 compute-0 podman[243389]: 2025-12-03 21:29:00.627166879 +0000 UTC m=+0.138931242 container init 59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_darwin, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 03 21:29:00 compute-0 podman[243389]: 2025-12-03 21:29:00.639616982 +0000 UTC m=+0.151381325 container start 59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_darwin, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:29:00 compute-0 podman[243389]: 2025-12-03 21:29:00.643304882 +0000 UTC m=+0.155069215 container attach 59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_darwin, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:29:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3654616200' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:29:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3654616200' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:29:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:01 compute-0 lvm[243485]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:29:01 compute-0 lvm[243485]: VG ceph_vg0 finished
Dec 03 21:29:01 compute-0 lvm[243486]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:29:01 compute-0 lvm[243486]: VG ceph_vg1 finished
Dec 03 21:29:01 compute-0 lvm[243487]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:29:01 compute-0 lvm[243487]: VG ceph_vg2 finished
Dec 03 21:29:01 compute-0 flamboyant_darwin[243406]: {}
Dec 03 21:29:01 compute-0 systemd[1]: libpod-59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d.scope: Deactivated successfully.
Dec 03 21:29:01 compute-0 systemd[1]: libpod-59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d.scope: Consumed 1.360s CPU time.
Dec 03 21:29:01 compute-0 podman[243389]: 2025-12-03 21:29:01.527740111 +0000 UTC m=+1.039504514 container died 59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_darwin, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle)
Dec 03 21:29:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-b1944ae2c67b4d552ae65ec1b33c08d997756e58e4d568629eb8516c60448bb7-merged.mount: Deactivated successfully.
Dec 03 21:29:01 compute-0 podman[243389]: 2025-12-03 21:29:01.583916565 +0000 UTC m=+1.095680898 container remove 59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_darwin, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:29:01 compute-0 systemd[1]: libpod-conmon-59b74b5f317a83bea41f09abecaca54fff546bc046b5813933a2e3a72cc7487d.scope: Deactivated successfully.
Dec 03 21:29:01 compute-0 sudo[243280]: pam_unix(sudo:session): session closed for user root
Dec 03 21:29:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:29:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:29:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:29:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:29:01 compute-0 ceph-mon[75204]: pgmap v672: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:29:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:29:01 compute-0 sudo[243502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:29:01 compute-0 sudo[243502]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:29:01 compute-0 sudo[243502]: pam_unix(sudo:session): session closed for user root
Dec 03 21:29:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v673: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:03 compute-0 ceph-mon[75204]: pgmap v673: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v674: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:05 compute-0 ceph-mon[75204]: pgmap v674: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v675: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:07 compute-0 ceph-mon[75204]: pgmap v675: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v676: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:09 compute-0 ceph-mon[75204]: pgmap v676: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:29:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:29:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v677: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #33. Immutable memtables: 0.
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.100794) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 33
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797351100827, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 2233, "num_deletes": 506, "total_data_size": 2250616, "memory_usage": 2298112, "flush_reason": "Manual Compaction"}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #34: started
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797351125465, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 34, "file_size": 2179430, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12103, "largest_seqno": 14335, "table_properties": {"data_size": 2169902, "index_size": 5510, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2949, "raw_key_size": 21666, "raw_average_key_size": 18, "raw_value_size": 2148773, "raw_average_value_size": 1853, "num_data_blocks": 253, "num_entries": 1159, "num_filter_entries": 1159, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797135, "oldest_key_time": 1764797135, "file_creation_time": 1764797351, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 34, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 24799 microseconds, and 10039 cpu microseconds.
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.125550) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #34: 2179430 bytes OK
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.125618) [db/memtable_list.cc:519] [default] Level-0 commit table #34 started
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.126783) [db/memtable_list.cc:722] [default] Level-0 commit table #34: memtable #1 done
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.126805) EVENT_LOG_v1 {"time_micros": 1764797351126798, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.126833) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2240263, prev total WAL file size 2240263, number of live WAL files 2.
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000030.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.127846) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [34(2128KB)], [32(4774KB)]
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797351127904, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [34], "files_L6": [32], "score": -1, "input_data_size": 7068903, "oldest_snapshot_seqno": -1}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #35: 3363 keys, 5627071 bytes, temperature: kUnknown
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797351181177, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 35, "file_size": 5627071, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5601893, "index_size": 15661, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8453, "raw_key_size": 79454, "raw_average_key_size": 23, "raw_value_size": 5538757, "raw_average_value_size": 1646, "num_data_blocks": 678, "num_entries": 3363, "num_filter_entries": 3363, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797351, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.181485) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 5627071 bytes
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.183003) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.4 rd, 105.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 4.7 +0.0 blob) out(5.4 +0.0 blob), read-write-amplify(5.8) write-amplify(2.6) OK, records in: 4388, records dropped: 1025 output_compression: NoCompression
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.183039) EVENT_LOG_v1 {"time_micros": 1764797351183021, "job": 14, "event": "compaction_finished", "compaction_time_micros": 53376, "compaction_time_cpu_micros": 27004, "output_level": 6, "num_output_files": 1, "total_output_size": 5627071, "num_input_records": 4388, "num_output_records": 3363, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000034.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797351183934, "job": 14, "event": "table_file_deletion", "file_number": 34}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797351185717, "job": 14, "event": "table_file_deletion", "file_number": 32}
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.127719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.185796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.185805) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.185809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.185813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:29:11 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:29:11.185817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:29:11 compute-0 ceph-mon[75204]: pgmap v677: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v678: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:13 compute-0 ceph-mon[75204]: pgmap v678: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v679: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:29:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:29:15 compute-0 ceph-mon[75204]: pgmap v679: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v680: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:17 compute-0 ceph-mon[75204]: pgmap v680: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v681: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:18 compute-0 podman[243527]: 2025-12-03 21:29:18.243792893 +0000 UTC m=+0.168903585 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 03 21:29:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:29:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:29:19 compute-0 ceph-mon[75204]: pgmap v681: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v682: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.271 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.272 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.296 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.297 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.297 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.310 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.310 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.311 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.311 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.311 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.311 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.311 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.311 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.312 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.340 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.341 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.342 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.342 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.343 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:29:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:29:20 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3362728199' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:29:20 compute-0 nova_compute[241566]: 2025-12-03 21:29:20.963 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.620s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:29:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.193 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.194 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5285MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.195 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.195 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:29:21
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['vms', 'images', 'volumes', '.mgr', 'cephfs.cephfs.data', 'backups', 'cephfs.cephfs.meta']
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [devicehealth INFO root] Check health
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.289 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.290 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.324 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:29:21 compute-0 ceph-mon[75204]: pgmap v682: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:21 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3362728199' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:29:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:29:21 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2368212542' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.859 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.871 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.891 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.892 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:29:21 compute-0 nova_compute[241566]: 2025-12-03 21:29:21.892 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:29:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:29:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v683: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:22 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2368212542' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:29:23 compute-0 ceph-mon[75204]: pgmap v683: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:24 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:29:24.065 151937 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '4a:b3:fa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:85:3a:67:f5:74'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 03 21:29:24 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:29:24.068 151937 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 03 21:29:24 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:29:24.070 151937 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f27c01e7-5b62-4209-a664-3ae50b74644d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 21:29:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v684: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:25 compute-0 ceph-mon[75204]: pgmap v684: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v685: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:29:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:29:27 compute-0 ceph-mon[75204]: pgmap v685: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v686: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:29 compute-0 ceph-mon[75204]: pgmap v686: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v687: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:31 compute-0 podman[243598]: 2025-12-03 21:29:31.1588533 +0000 UTC m=+0.080983800 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 03 21:29:31 compute-0 podman[243597]: 2025-12-03 21:29:31.167354358 +0000 UTC m=+0.092807167 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:29:31 compute-0 ceph-mon[75204]: pgmap v687: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v688: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:33 compute-0 ceph-mon[75204]: pgmap v688: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v689: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:35 compute-0 ceph-mon[75204]: pgmap v689: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v690: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:37 compute-0 ceph-mon[75204]: pgmap v690: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v691: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:39 compute-0 ceph-mon[75204]: pgmap v691: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v692: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:41 compute-0 ceph-mon[75204]: pgmap v692: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v693: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:43 compute-0 ceph-mon[75204]: pgmap v693: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v694: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:45 compute-0 ceph-mon[75204]: pgmap v694: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v695: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:47 compute-0 ceph-mon[75204]: pgmap v695: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v696: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:29:48.932 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:29:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:29:48.933 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:29:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:29:48.933 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:29:49 compute-0 podman[243635]: 2025-12-03 21:29:49.20255909 +0000 UTC m=+0.135929362 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 03 21:29:49 compute-0 ceph-mon[75204]: pgmap v696: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v697: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:29:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:29:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:29:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:29:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:29:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:29:51 compute-0 ceph-mon[75204]: pgmap v697: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v698: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:53 compute-0 ceph-mon[75204]: pgmap v698: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v699: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:55 compute-0 ceph-mon[75204]: pgmap v699: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:29:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v700: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:57 compute-0 ceph-mon[75204]: pgmap v700: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v701: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:29:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:29:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2897846278' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:29:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:29:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2897846278' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:30:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v702: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:00 compute-0 ceph-mon[75204]: pgmap v701: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2897846278' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:30:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2897846278' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:30:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:01 compute-0 ceph-mon[75204]: pgmap v702: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:01 compute-0 sudo[243662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:30:01 compute-0 sudo[243662]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:01 compute-0 sudo[243662]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:01 compute-0 podman[243686]: 2025-12-03 21:30:01.877001492 +0000 UTC m=+0.060639304 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 03 21:30:01 compute-0 podman[243687]: 2025-12-03 21:30:01.876971402 +0000 UTC m=+0.056459103 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 21:30:01 compute-0 sudo[243705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:30:01 compute-0 sudo[243705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v703: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:02 compute-0 sudo[243705]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0)
Dec 03 21:30:02 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:30:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:30:02 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:30:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:30:02 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:30:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:30:02 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:30:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:30:02 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:30:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:30:02 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:30:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:30:02 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:30:02 compute-0 sudo[243782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:30:02 compute-0 sudo[243782]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:02 compute-0 sudo[243782]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:02 compute-0 sudo[243807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:30:02 compute-0 sudo[243807]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:03 compute-0 podman[243844]: 2025-12-03 21:30:03.058429906 +0000 UTC m=+0.061553400 container create 707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_satoshi, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 03 21:30:03 compute-0 systemd[1]: Started libpod-conmon-707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa.scope.
Dec 03 21:30:03 compute-0 podman[243844]: 2025-12-03 21:30:03.0268452 +0000 UTC m=+0.029968744 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:30:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:30:03 compute-0 podman[243844]: 2025-12-03 21:30:03.172778229 +0000 UTC m=+0.175901743 container init 707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_satoshi, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:30:03 compute-0 podman[243844]: 2025-12-03 21:30:03.185373256 +0000 UTC m=+0.188496720 container start 707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_satoshi, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:30:03 compute-0 podman[243844]: 2025-12-03 21:30:03.188466419 +0000 UTC m=+0.191589933 container attach 707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_satoshi, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:30:03 compute-0 magical_satoshi[243861]: 167 167
Dec 03 21:30:03 compute-0 systemd[1]: libpod-707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa.scope: Deactivated successfully.
Dec 03 21:30:03 compute-0 conmon[243861]: conmon 707f37c1ef48842071d6 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa.scope/container/memory.events
Dec 03 21:30:03 compute-0 podman[243844]: 2025-12-03 21:30:03.196365 +0000 UTC m=+0.199488464 container died 707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_satoshi, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:30:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-b25f69a5565dbd5d9357856f032cb14ba98991c4e756430b52ad5ce4df631da5-merged.mount: Deactivated successfully.
Dec 03 21:30:03 compute-0 podman[243844]: 2025-12-03 21:30:03.235322654 +0000 UTC m=+0.238446118 container remove 707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=magical_satoshi, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS)
Dec 03 21:30:03 compute-0 ceph-mon[75204]: pgmap v703: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:30:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:30:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:30:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:30:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:30:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:30:03 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:30:03 compute-0 systemd[1]: libpod-conmon-707f37c1ef48842071d606f145bbeea02611403fc3b7fd74061e8dcb92c283fa.scope: Deactivated successfully.
Dec 03 21:30:03 compute-0 podman[243884]: 2025-12-03 21:30:03.396012717 +0000 UTC m=+0.040389222 container create 4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_nash, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:30:03 compute-0 systemd[1]: Started libpod-conmon-4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf.scope.
Dec 03 21:30:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:30:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53e685c0bacb3d852b22f8783ecb06881ac8b0acd8b49013815569da6027b8f4/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53e685c0bacb3d852b22f8783ecb06881ac8b0acd8b49013815569da6027b8f4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53e685c0bacb3d852b22f8783ecb06881ac8b0acd8b49013815569da6027b8f4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53e685c0bacb3d852b22f8783ecb06881ac8b0acd8b49013815569da6027b8f4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53e685c0bacb3d852b22f8783ecb06881ac8b0acd8b49013815569da6027b8f4/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:03 compute-0 podman[243884]: 2025-12-03 21:30:03.377828671 +0000 UTC m=+0.022205206 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:30:03 compute-0 podman[243884]: 2025-12-03 21:30:03.477740327 +0000 UTC m=+0.122116822 container init 4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_nash, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:30:03 compute-0 podman[243884]: 2025-12-03 21:30:03.486915333 +0000 UTC m=+0.131291818 container start 4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_nash, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:30:03 compute-0 podman[243884]: 2025-12-03 21:30:03.489445971 +0000 UTC m=+0.133822476 container attach 4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_nash, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:30:03 compute-0 laughing_nash[243900]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:30:03 compute-0 laughing_nash[243900]: --> All data devices are unavailable
Dec 03 21:30:03 compute-0 systemd[1]: libpod-4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf.scope: Deactivated successfully.
Dec 03 21:30:04 compute-0 conmon[243900]: conmon 4bbb5972c58771df1f77 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf.scope/container/memory.events
Dec 03 21:30:04 compute-0 podman[243884]: 2025-12-03 21:30:04.001917777 +0000 UTC m=+0.646294282 container died 4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_nash, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:30:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-53e685c0bacb3d852b22f8783ecb06881ac8b0acd8b49013815569da6027b8f4-merged.mount: Deactivated successfully.
Dec 03 21:30:04 compute-0 podman[243884]: 2025-12-03 21:30:04.08865127 +0000 UTC m=+0.733027765 container remove 4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_nash, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:30:04 compute-0 systemd[1]: libpod-conmon-4bbb5972c58771df1f77866be743d991028cf3f85773e5a39227ef4b8f85ddcf.scope: Deactivated successfully.
Dec 03 21:30:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v704: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:04 compute-0 sudo[243807]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:04 compute-0 sudo[243933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:30:04 compute-0 sudo[243933]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:04 compute-0 sudo[243933]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:04 compute-0 sudo[243958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:30:04 compute-0 sudo[243958]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:04 compute-0 podman[243995]: 2025-12-03 21:30:04.584224713 +0000 UTC m=+0.042393797 container create 0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_neumann, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:30:04 compute-0 systemd[1]: Started libpod-conmon-0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782.scope.
Dec 03 21:30:04 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:30:04 compute-0 podman[243995]: 2025-12-03 21:30:04.565186743 +0000 UTC m=+0.023355867 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:30:04 compute-0 podman[243995]: 2025-12-03 21:30:04.670316889 +0000 UTC m=+0.128485983 container init 0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_neumann, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3)
Dec 03 21:30:04 compute-0 podman[243995]: 2025-12-03 21:30:04.681396735 +0000 UTC m=+0.139565819 container start 0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_neumann, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:30:04 compute-0 podman[243995]: 2025-12-03 21:30:04.684625262 +0000 UTC m=+0.142794356 container attach 0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_neumann, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:30:04 compute-0 heuristic_neumann[244011]: 167 167
Dec 03 21:30:04 compute-0 systemd[1]: libpod-0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782.scope: Deactivated successfully.
Dec 03 21:30:04 compute-0 podman[243995]: 2025-12-03 21:30:04.686316697 +0000 UTC m=+0.144485801 container died 0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_neumann, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:30:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-e0338eaaabe2d4a2b564097b8dc20c6c8f618a9d30d89c07da02485009b55a17-merged.mount: Deactivated successfully.
Dec 03 21:30:04 compute-0 podman[243995]: 2025-12-03 21:30:04.736309267 +0000 UTC m=+0.194478371 container remove 0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=heuristic_neumann, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default)
Dec 03 21:30:04 compute-0 systemd[1]: libpod-conmon-0b94060d708efc7b28d50c60d9882ff5771b1ee00266d56f7e4b8a2677abd782.scope: Deactivated successfully.
Dec 03 21:30:05 compute-0 podman[244034]: 2025-12-03 21:30:04.946463245 +0000 UTC m=+0.025843053 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:30:05 compute-0 podman[244034]: 2025-12-03 21:30:05.568943497 +0000 UTC m=+0.648323295 container create e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_chebyshev, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:30:05 compute-0 ceph-mon[75204]: pgmap v704: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:05 compute-0 systemd[1]: Started libpod-conmon-e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c.scope.
Dec 03 21:30:05 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:30:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a22191375e053be6c53ac5d1cbdb4a625ae9a77ac2dc566fa51501f9e31fd1c2/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a22191375e053be6c53ac5d1cbdb4a625ae9a77ac2dc566fa51501f9e31fd1c2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a22191375e053be6c53ac5d1cbdb4a625ae9a77ac2dc566fa51501f9e31fd1c2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a22191375e053be6c53ac5d1cbdb4a625ae9a77ac2dc566fa51501f9e31fd1c2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:05 compute-0 podman[244034]: 2025-12-03 21:30:05.678209294 +0000 UTC m=+0.757589102 container init e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_chebyshev, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:30:05 compute-0 podman[244034]: 2025-12-03 21:30:05.698438646 +0000 UTC m=+0.777818424 container start e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_chebyshev, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:30:05 compute-0 podman[244034]: 2025-12-03 21:30:05.702648449 +0000 UTC m=+0.782028277 container attach e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_chebyshev, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]: {
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:     "0": [
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:         {
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "devices": [
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "/dev/loop3"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             ],
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_name": "ceph_lv0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_size": "21470642176",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "name": "ceph_lv0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "tags": {
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cluster_name": "ceph",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.crush_device_class": "",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.encrypted": "0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.objectstore": "bluestore",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osd_id": "0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.type": "block",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.vdo": "0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.with_tpm": "0"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             },
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "type": "block",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "vg_name": "ceph_vg0"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:         }
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:     ],
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:     "1": [
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:         {
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "devices": [
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "/dev/loop4"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             ],
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_name": "ceph_lv1",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_size": "21470642176",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "name": "ceph_lv1",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "tags": {
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cluster_name": "ceph",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.crush_device_class": "",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.encrypted": "0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.objectstore": "bluestore",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osd_id": "1",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.type": "block",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.vdo": "0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.with_tpm": "0"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             },
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "type": "block",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "vg_name": "ceph_vg1"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:         }
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:     ],
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:     "2": [
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:         {
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "devices": [
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "/dev/loop5"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             ],
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_name": "ceph_lv2",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_size": "21470642176",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "name": "ceph_lv2",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "tags": {
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.cluster_name": "ceph",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.crush_device_class": "",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.encrypted": "0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.objectstore": "bluestore",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osd_id": "2",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.type": "block",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.vdo": "0",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:                 "ceph.with_tpm": "0"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             },
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "type": "block",
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:             "vg_name": "ceph_vg2"
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:         }
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]:     ]
Dec 03 21:30:06 compute-0 agitated_chebyshev[244051]: }
Dec 03 21:30:06 compute-0 systemd[1]: libpod-e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c.scope: Deactivated successfully.
Dec 03 21:30:06 compute-0 podman[244034]: 2025-12-03 21:30:06.085085652 +0000 UTC m=+1.164465440 container died e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_chebyshev, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True)
Dec 03 21:30:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v705: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-a22191375e053be6c53ac5d1cbdb4a625ae9a77ac2dc566fa51501f9e31fd1c2-merged.mount: Deactivated successfully.
Dec 03 21:30:06 compute-0 podman[244034]: 2025-12-03 21:30:06.149081186 +0000 UTC m=+1.228460954 container remove e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=agitated_chebyshev, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:30:06 compute-0 systemd[1]: libpod-conmon-e385f06d816ce4dc28410596964e5e143f8af34dda4630723489ba51e0a6d86c.scope: Deactivated successfully.
Dec 03 21:30:06 compute-0 sudo[243958]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:06 compute-0 sudo[244070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:30:06 compute-0 sudo[244070]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:06 compute-0 sudo[244070]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:06 compute-0 sudo[244095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:30:06 compute-0 sudo[244095]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:06 compute-0 podman[244132]: 2025-12-03 21:30:06.736545641 +0000 UTC m=+0.082549623 container create 8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_black, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:30:06 compute-0 podman[244132]: 2025-12-03 21:30:06.681226069 +0000 UTC m=+0.027230141 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:30:06 compute-0 systemd[1]: Started libpod-conmon-8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d.scope.
Dec 03 21:30:06 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:30:06 compute-0 podman[244132]: 2025-12-03 21:30:06.886865628 +0000 UTC m=+0.232869640 container init 8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_black, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3)
Dec 03 21:30:06 compute-0 podman[244132]: 2025-12-03 21:30:06.892601071 +0000 UTC m=+0.238605043 container start 8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_black, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:30:06 compute-0 podman[244132]: 2025-12-03 21:30:06.896283449 +0000 UTC m=+0.242287431 container attach 8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_black, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:30:06 compute-0 focused_black[244149]: 167 167
Dec 03 21:30:06 compute-0 systemd[1]: libpod-8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d.scope: Deactivated successfully.
Dec 03 21:30:06 compute-0 podman[244132]: 2025-12-03 21:30:06.897789859 +0000 UTC m=+0.243793831 container died 8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_black, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:30:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-32550418eea9e443147418caf4bb4e741b3cafd4ee0fa7d2cb6e2b63de22219b-merged.mount: Deactivated successfully.
Dec 03 21:30:06 compute-0 podman[244132]: 2025-12-03 21:30:06.936889767 +0000 UTC m=+0.282893739 container remove 8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=focused_black, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:30:06 compute-0 systemd[1]: libpod-conmon-8599f88c3de5956601f3106a5bed47faa5e8265367ba87439643eb755524837d.scope: Deactivated successfully.
Dec 03 21:30:07 compute-0 podman[244173]: 2025-12-03 21:30:07.099260955 +0000 UTC m=+0.046365102 container create e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_faraday, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:30:07 compute-0 systemd[1]: Started libpod-conmon-e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08.scope.
Dec 03 21:30:07 compute-0 podman[244173]: 2025-12-03 21:30:07.076347292 +0000 UTC m=+0.023451489 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:30:07 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:30:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fe0d2d32986483ae8a4fd29a8fcc99a740cacab280de21422f3a4ed53c7bb74/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fe0d2d32986483ae8a4fd29a8fcc99a740cacab280de21422f3a4ed53c7bb74/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fe0d2d32986483ae8a4fd29a8fcc99a740cacab280de21422f3a4ed53c7bb74/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fe0d2d32986483ae8a4fd29a8fcc99a740cacab280de21422f3a4ed53c7bb74/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:30:07 compute-0 podman[244173]: 2025-12-03 21:30:07.194291251 +0000 UTC m=+0.141395448 container init e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_faraday, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:30:07 compute-0 podman[244173]: 2025-12-03 21:30:07.209731325 +0000 UTC m=+0.156835462 container start e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:30:07 compute-0 podman[244173]: 2025-12-03 21:30:07.215190871 +0000 UTC m=+0.162295068 container attach e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_faraday, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:30:07 compute-0 ceph-mon[75204]: pgmap v705: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:07 compute-0 lvm[244268]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:30:07 compute-0 lvm[244268]: VG ceph_vg0 finished
Dec 03 21:30:07 compute-0 lvm[244267]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:30:07 compute-0 lvm[244267]: VG ceph_vg1 finished
Dec 03 21:30:07 compute-0 lvm[244270]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:30:07 compute-0 lvm[244270]: VG ceph_vg2 finished
Dec 03 21:30:08 compute-0 funny_faraday[244189]: {}
Dec 03 21:30:08 compute-0 systemd[1]: libpod-e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08.scope: Deactivated successfully.
Dec 03 21:30:08 compute-0 podman[244173]: 2025-12-03 21:30:08.04139333 +0000 UTC m=+0.988497477 container died e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_faraday, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:30:08 compute-0 systemd[1]: libpod-e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08.scope: Consumed 1.370s CPU time.
Dec 03 21:30:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-5fe0d2d32986483ae8a4fd29a8fcc99a740cacab280de21422f3a4ed53c7bb74-merged.mount: Deactivated successfully.
Dec 03 21:30:08 compute-0 podman[244173]: 2025-12-03 21:30:08.097976086 +0000 UTC m=+1.045080243 container remove e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=funny_faraday, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:30:08 compute-0 systemd[1]: libpod-conmon-e35ecd12ea9449ebaa69606fa6df626fef7109ddc1183df9905abd836958ba08.scope: Deactivated successfully.
Dec 03 21:30:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v706: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:08 compute-0 sudo[244095]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:30:08 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:30:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:30:08 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:30:08 compute-0 sudo[244285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:30:08 compute-0 sudo[244285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:30:08 compute-0 sudo[244285]: pam_unix(sudo:session): session closed for user root
Dec 03 21:30:09 compute-0 ceph-mon[75204]: pgmap v706: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:30:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:30:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v707: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:11 compute-0 ceph-mon[75204]: pgmap v707: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v708: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:13 compute-0 ceph-mon[75204]: pgmap v708: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v709: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:15 compute-0 ceph-mon[75204]: pgmap v709: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v710: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:17 compute-0 ceph-mon[75204]: pgmap v710: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v711: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:19 compute-0 ceph-mon[75204]: pgmap v711: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v712: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:20 compute-0 podman[244310]: 2025-12-03 21:30:20.174990815 +0000 UTC m=+0.113642425 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:30:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:30:21
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['volumes', 'vms', 'cephfs.cephfs.meta', 'backups', '.mgr', 'cephfs.cephfs.data', 'images']
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:30:21 compute-0 ceph-mon[75204]: pgmap v712: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.894 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.895 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.895 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.895 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:30:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.940 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.940 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.940 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.941 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.941 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.941 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.941 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.941 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.942 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.974 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.974 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.975 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.975 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:30:21 compute-0 nova_compute[241566]: 2025-12-03 21:30:21.975 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:30:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v713: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:30:22 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2744860036' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.487 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.719 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.721 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5299MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.721 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.722 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.794 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.795 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:30:22 compute-0 nova_compute[241566]: 2025-12-03 21:30:22.820 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:30:23 compute-0 ceph-mon[75204]: pgmap v713: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:23 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2744860036' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:30:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:30:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/116423800' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:30:23 compute-0 nova_compute[241566]: 2025-12-03 21:30:23.380 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:30:23 compute-0 nova_compute[241566]: 2025-12-03 21:30:23.386 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:30:23 compute-0 nova_compute[241566]: 2025-12-03 21:30:23.415 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:30:23 compute-0 nova_compute[241566]: 2025-12-03 21:30:23.417 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:30:23 compute-0 nova_compute[241566]: 2025-12-03 21:30:23.417 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.695s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:30:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v714: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:24 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/116423800' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:30:25 compute-0 ceph-mon[75204]: pgmap v714: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v715: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:27 compute-0 ceph-mon[75204]: pgmap v715: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.7344613959254126e-06 of space, bias 4.0, pg target 0.002081353675110495 quantized to 16 (current 16)
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:30:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:30:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v716: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:29 compute-0 ceph-mon[75204]: pgmap v716: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v717: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:31 compute-0 ceph-mon[75204]: pgmap v717: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v718: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:32 compute-0 podman[244382]: 2025-12-03 21:30:32.159540879 +0000 UTC m=+0.093125255 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 03 21:30:32 compute-0 podman[244381]: 2025-12-03 21:30:32.167349689 +0000 UTC m=+0.107427509 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 21:30:33 compute-0 ceph-mon[75204]: pgmap v718: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v719: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:35 compute-0 ceph-mon[75204]: pgmap v719: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v720: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:37 compute-0 ceph-mon[75204]: pgmap v720: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v721: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:39 compute-0 ceph-mon[75204]: pgmap v721: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v722: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:41 compute-0 ceph-mon[75204]: pgmap v722: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:41 compute-0 sshd-session[244422]: Connection reset by authenticating user root 45.135.232.92 port 63142 [preauth]
Dec 03 21:30:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v723: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:43 compute-0 ceph-mon[75204]: pgmap v723: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v724: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:44 compute-0 sshd-session[244424]: Connection reset by authenticating user root 45.135.232.92 port 63158 [preauth]
Dec 03 21:30:45 compute-0 ceph-mon[75204]: pgmap v724: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:46 compute-0 sshd-session[244426]: Invalid user onlime_r from 45.135.232.92 port 23868
Dec 03 21:30:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v725: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:46 compute-0 sshd-session[244426]: Connection reset by invalid user onlime_r 45.135.232.92 port 23868 [preauth]
Dec 03 21:30:47 compute-0 ceph-mon[75204]: pgmap v725: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v726: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:48 compute-0 sshd-session[244428]: Invalid user admin from 45.135.232.92 port 23878
Dec 03 21:30:48 compute-0 sshd-session[244428]: Connection reset by invalid user admin 45.135.232.92 port 23878 [preauth]
Dec 03 21:30:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:30:48.934 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:30:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:30:48.935 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:30:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:30:48.935 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:30:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e62 do_prune osdmap full prune enabled
Dec 03 21:30:49 compute-0 ceph-mon[75204]: pgmap v726: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e63 e63: 3 total, 3 up, 3 in
Dec 03 21:30:49 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e63: 3 total, 3 up, 3 in
Dec 03 21:30:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v728: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:50 compute-0 sshd-session[244430]: Invalid user admin from 45.135.232.92 port 23884
Dec 03 21:30:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e63 do_prune osdmap full prune enabled
Dec 03 21:30:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e64 e64: 3 total, 3 up, 3 in
Dec 03 21:30:50 compute-0 ceph-mon[75204]: osdmap e63: 3 total, 3 up, 3 in
Dec 03 21:30:50 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e64: 3 total, 3 up, 3 in
Dec 03 21:30:50 compute-0 podman[244432]: 2025-12-03 21:30:50.610313874 +0000 UTC m=+0.098892758 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 03 21:30:50 compute-0 sshd-session[244430]: Connection reset by invalid user admin 45.135.232.92 port 23884 [preauth]
Dec 03 21:30:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e64 do_prune osdmap full prune enabled
Dec 03 21:30:51 compute-0 ceph-mon[75204]: pgmap v728: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:30:51 compute-0 ceph-mon[75204]: osdmap e64: 3 total, 3 up, 3 in
Dec 03 21:30:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e65 e65: 3 total, 3 up, 3 in
Dec 03 21:30:51 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e65: 3 total, 3 up, 3 in
Dec 03 21:30:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:30:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:30:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:30:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:30:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:30:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:30:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v731: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 4.2 KiB/s rd, 682 B/s wr, 5 op/s
Dec 03 21:30:52 compute-0 ceph-mon[75204]: osdmap e65: 3 total, 3 up, 3 in
Dec 03 21:30:53 compute-0 ceph-mon[75204]: pgmap v731: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 4.2 KiB/s rd, 682 B/s wr, 5 op/s
Dec 03 21:30:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v732: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 4.2 KiB/s rd, 682 B/s wr, 5 op/s
Dec 03 21:30:54 compute-0 sshd-session[244456]: Received disconnect from 193.46.255.244 port 51587:11:  [preauth]
Dec 03 21:30:54 compute-0 sshd-session[244456]: Disconnected from authenticating user root 193.46.255.244 port 51587 [preauth]
Dec 03 21:30:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e65 do_prune osdmap full prune enabled
Dec 03 21:30:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e66 e66: 3 total, 3 up, 3 in
Dec 03 21:30:55 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e66: 3 total, 3 up, 3 in
Dec 03 21:30:56 compute-0 ceph-mon[75204]: pgmap v732: 177 pgs: 177 active+clean; 451 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 4.2 KiB/s rd, 682 B/s wr, 5 op/s
Dec 03 21:30:56 compute-0 ceph-mon[75204]: osdmap e66: 3 total, 3 up, 3 in
Dec 03 21:30:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e66 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:30:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v734: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 6.8 MiB/s wr, 63 op/s
Dec 03 21:30:58 compute-0 ceph-mon[75204]: pgmap v734: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 6.8 MiB/s wr, 63 op/s
Dec 03 21:30:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v735: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 5.4 MiB/s wr, 50 op/s
Dec 03 21:30:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:30:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/553097494' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:30:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:30:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/553097494' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:31:00 compute-0 ceph-mon[75204]: pgmap v735: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 5.4 MiB/s wr, 50 op/s
Dec 03 21:31:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/553097494' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:31:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/553097494' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:31:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v736: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 4.8 MiB/s wr, 40 op/s
Dec 03 21:31:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e66 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e66 do_prune osdmap full prune enabled
Dec 03 21:31:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 e67: 3 total, 3 up, 3 in
Dec 03 21:31:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e67: 3 total, 3 up, 3 in
Dec 03 21:31:02 compute-0 ceph-mon[75204]: pgmap v736: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 4.8 MiB/s wr, 40 op/s
Dec 03 21:31:02 compute-0 ceph-mon[75204]: osdmap e67: 3 total, 3 up, 3 in
Dec 03 21:31:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v738: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 5.1 MiB/s wr, 43 op/s
Dec 03 21:31:03 compute-0 podman[244458]: 2025-12-03 21:31:03.139033423 +0000 UTC m=+0.077522827 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 03 21:31:03 compute-0 podman[244459]: 2025-12-03 21:31:03.172355236 +0000 UTC m=+0.095750996 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:31:04 compute-0 ceph-mon[75204]: pgmap v738: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 5.1 MiB/s wr, 43 op/s
Dec 03 21:31:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v739: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 4.5 MiB/s wr, 37 op/s
Dec 03 21:31:06 compute-0 ceph-mon[75204]: pgmap v739: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 4.5 MiB/s wr, 37 op/s
Dec 03 21:31:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v740: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:07 compute-0 ceph-mon[75204]: pgmap v740: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v741: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:08 compute-0 sudo[244499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:31:08 compute-0 sudo[244499]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:08 compute-0 sudo[244499]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:08 compute-0 sudo[244524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:31:08 compute-0 sudo[244524]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:09 compute-0 sudo[244524]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:31:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:31:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:31:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:31:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:31:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:31:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:31:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:31:09 compute-0 ceph-mon[75204]: pgmap v741: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:31:09 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:31:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:31:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:31:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:31:09 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:31:09 compute-0 sudo[244580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:31:09 compute-0 sudo[244580]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:09 compute-0 sudo[244580]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:09 compute-0 sudo[244606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:31:09 compute-0 sudo[244606]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:09 compute-0 podman[244643]: 2025-12-03 21:31:09.758852664 +0000 UTC m=+0.068566078 container create 9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_jepsen, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:31:09 compute-0 systemd[1]: Started libpod-conmon-9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e.scope.
Dec 03 21:31:09 compute-0 podman[244643]: 2025-12-03 21:31:09.7322148 +0000 UTC m=+0.041928294 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:31:09 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:31:09 compute-0 podman[244643]: 2025-12-03 21:31:09.864304317 +0000 UTC m=+0.174017821 container init 9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_jepsen, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:31:09 compute-0 podman[244643]: 2025-12-03 21:31:09.880320516 +0000 UTC m=+0.190033930 container start 9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_jepsen, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:31:09 compute-0 podman[244643]: 2025-12-03 21:31:09.884862778 +0000 UTC m=+0.194576212 container attach 9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_jepsen, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, ceph=True)
Dec 03 21:31:09 compute-0 modest_jepsen[244660]: 167 167
Dec 03 21:31:09 compute-0 systemd[1]: libpod-9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e.scope: Deactivated successfully.
Dec 03 21:31:09 compute-0 conmon[244660]: conmon 9219fdbf87b88b185b85 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e.scope/container/memory.events
Dec 03 21:31:09 compute-0 podman[244643]: 2025-12-03 21:31:09.891068154 +0000 UTC m=+0.200781568 container died 9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_jepsen, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:31:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-0e01889006dddf9c16c350ebf1ae83b9ea7274ed9c063266865401d5919dd8d9-merged.mount: Deactivated successfully.
Dec 03 21:31:09 compute-0 podman[244643]: 2025-12-03 21:31:09.953995868 +0000 UTC m=+0.263709312 container remove 9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_jepsen, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:31:09 compute-0 systemd[1]: libpod-conmon-9219fdbf87b88b185b85db227db888abd1e88108843eff7d01b9379c0971fc8e.scope: Deactivated successfully.
Dec 03 21:31:10 compute-0 podman[244684]: 2025-12-03 21:31:10.141070838 +0000 UTC m=+0.037822643 container create 8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zen_shirley, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:31:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v742: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:10 compute-0 systemd[1]: Started libpod-conmon-8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80.scope.
Dec 03 21:31:10 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:31:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ec0ae966a4bd0db1743b76bc61541bbf8eaada5833bd50e4471e522b6924611/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ec0ae966a4bd0db1743b76bc61541bbf8eaada5833bd50e4471e522b6924611/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ec0ae966a4bd0db1743b76bc61541bbf8eaada5833bd50e4471e522b6924611/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ec0ae966a4bd0db1743b76bc61541bbf8eaada5833bd50e4471e522b6924611/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ec0ae966a4bd0db1743b76bc61541bbf8eaada5833bd50e4471e522b6924611/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:10 compute-0 podman[244684]: 2025-12-03 21:31:10.215163983 +0000 UTC m=+0.111915788 container init 8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zen_shirley, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:31:10 compute-0 podman[244684]: 2025-12-03 21:31:10.124402432 +0000 UTC m=+0.021154237 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:31:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:31:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:31:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:31:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:31:10 compute-0 podman[244684]: 2025-12-03 21:31:10.226690862 +0000 UTC m=+0.123442657 container start 8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zen_shirley, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:31:10 compute-0 podman[244684]: 2025-12-03 21:31:10.230523134 +0000 UTC m=+0.127274919 container attach 8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zen_shirley, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:31:10 compute-0 zen_shirley[244700]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:31:10 compute-0 zen_shirley[244700]: --> All data devices are unavailable
Dec 03 21:31:10 compute-0 systemd[1]: libpod-8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80.scope: Deactivated successfully.
Dec 03 21:31:10 compute-0 podman[244684]: 2025-12-03 21:31:10.778389965 +0000 UTC m=+0.675141760 container died 8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zen_shirley, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:31:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-7ec0ae966a4bd0db1743b76bc61541bbf8eaada5833bd50e4471e522b6924611-merged.mount: Deactivated successfully.
Dec 03 21:31:10 compute-0 podman[244684]: 2025-12-03 21:31:10.84876761 +0000 UTC m=+0.745519445 container remove 8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=zen_shirley, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:31:10 compute-0 systemd[1]: libpod-conmon-8dd3c8bed57dec60541e3c9d8a45344246835a48c6aba45ad0f428da98b84d80.scope: Deactivated successfully.
Dec 03 21:31:10 compute-0 sudo[244606]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:10 compute-0 sudo[244731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:31:10 compute-0 sudo[244731]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:10 compute-0 sudo[244731]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:11 compute-0 sudo[244756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:31:11 compute-0 sudo[244756]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:11 compute-0 ceph-mon[75204]: pgmap v742: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:11 compute-0 podman[244794]: 2025-12-03 21:31:11.361138211 +0000 UTC m=+0.055000584 container create e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_colden, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:31:11 compute-0 systemd[1]: Started libpod-conmon-e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e.scope.
Dec 03 21:31:11 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:31:11 compute-0 podman[244794]: 2025-12-03 21:31:11.339521422 +0000 UTC m=+0.033383805 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:31:11 compute-0 podman[244794]: 2025-12-03 21:31:11.44887984 +0000 UTC m=+0.142742293 container init e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_colden, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:31:11 compute-0 podman[244794]: 2025-12-03 21:31:11.462032702 +0000 UTC m=+0.155895105 container start e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_colden, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:31:11 compute-0 podman[244794]: 2025-12-03 21:31:11.466513692 +0000 UTC m=+0.160376095 container attach e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_colden, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:31:11 compute-0 goofy_colden[244810]: 167 167
Dec 03 21:31:11 compute-0 systemd[1]: libpod-e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e.scope: Deactivated successfully.
Dec 03 21:31:11 compute-0 podman[244794]: 2025-12-03 21:31:11.468978478 +0000 UTC m=+0.162840881 container died e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_colden, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:31:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-20b3608875c643d0cffc7ed6a7395efad972ecd482128762451ffad39de5efc3-merged.mount: Deactivated successfully.
Dec 03 21:31:11 compute-0 podman[244794]: 2025-12-03 21:31:11.522783459 +0000 UTC m=+0.216645832 container remove e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=goofy_colden, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:31:11 compute-0 systemd[1]: libpod-conmon-e16dbd4904e958ee9eba5c69dc9be12043440e6ca67fbe817ea8c9bb8dce296e.scope: Deactivated successfully.
Dec 03 21:31:11 compute-0 podman[244834]: 2025-12-03 21:31:11.76443484 +0000 UTC m=+0.064728434 container create dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_mestorf, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:31:11 compute-0 systemd[1]: Started libpod-conmon-dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4.scope.
Dec 03 21:31:11 compute-0 podman[244834]: 2025-12-03 21:31:11.741707072 +0000 UTC m=+0.042000676 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:31:11 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:31:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4bf0a6b47274cc1fc7a96b4970f99aa5ebae36e27797274a92dedc5d66128de/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4bf0a6b47274cc1fc7a96b4970f99aa5ebae36e27797274a92dedc5d66128de/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4bf0a6b47274cc1fc7a96b4970f99aa5ebae36e27797274a92dedc5d66128de/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4bf0a6b47274cc1fc7a96b4970f99aa5ebae36e27797274a92dedc5d66128de/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:11 compute-0 podman[244834]: 2025-12-03 21:31:11.89777065 +0000 UTC m=+0.198064294 container init dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_mestorf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:31:11 compute-0 podman[244834]: 2025-12-03 21:31:11.909537925 +0000 UTC m=+0.209831519 container start dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_mestorf, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle)
Dec 03 21:31:11 compute-0 podman[244834]: 2025-12-03 21:31:11.914012885 +0000 UTC m=+0.214306479 container attach dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_mestorf, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:31:11 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:31:11.954 151937 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '4a:b3:fa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:85:3a:67:f5:74'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 03 21:31:11 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:31:11.960 151937 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 03 21:31:11 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:31:11.962 151937 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f27c01e7-5b62-4209-a664-3ae50b74644d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 21:31:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v743: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:12 compute-0 silly_mestorf[244850]: {
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:     "0": [
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:         {
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "devices": [
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "/dev/loop3"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             ],
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_name": "ceph_lv0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_size": "21470642176",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "name": "ceph_lv0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "tags": {
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cluster_name": "ceph",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.crush_device_class": "",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.encrypted": "0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.objectstore": "bluestore",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osd_id": "0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.type": "block",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.vdo": "0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.with_tpm": "0"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             },
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "type": "block",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "vg_name": "ceph_vg0"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:         }
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:     ],
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:     "1": [
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:         {
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "devices": [
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "/dev/loop4"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             ],
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_name": "ceph_lv1",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_size": "21470642176",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "name": "ceph_lv1",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "tags": {
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cluster_name": "ceph",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.crush_device_class": "",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.encrypted": "0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.objectstore": "bluestore",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osd_id": "1",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.type": "block",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.vdo": "0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.with_tpm": "0"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             },
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "type": "block",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "vg_name": "ceph_vg1"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:         }
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:     ],
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:     "2": [
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:         {
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "devices": [
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "/dev/loop5"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             ],
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_name": "ceph_lv2",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_size": "21470642176",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "name": "ceph_lv2",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "tags": {
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.cluster_name": "ceph",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.crush_device_class": "",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.encrypted": "0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.objectstore": "bluestore",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osd_id": "2",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.type": "block",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.vdo": "0",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:                 "ceph.with_tpm": "0"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             },
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "type": "block",
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:             "vg_name": "ceph_vg2"
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:         }
Dec 03 21:31:12 compute-0 silly_mestorf[244850]:     ]
Dec 03 21:31:12 compute-0 silly_mestorf[244850]: }
Dec 03 21:31:12 compute-0 systemd[1]: libpod-dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4.scope: Deactivated successfully.
Dec 03 21:31:12 compute-0 podman[244834]: 2025-12-03 21:31:12.307000109 +0000 UTC m=+0.607293693 container died dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_mestorf, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 03 21:31:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-f4bf0a6b47274cc1fc7a96b4970f99aa5ebae36e27797274a92dedc5d66128de-merged.mount: Deactivated successfully.
Dec 03 21:31:12 compute-0 podman[244834]: 2025-12-03 21:31:12.358293763 +0000 UTC m=+0.658587317 container remove dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=silly_mestorf, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:31:12 compute-0 systemd[1]: libpod-conmon-dbbb187861340b4569459a547582f2076869cb837e0f38f19ff2b61146cf83e4.scope: Deactivated successfully.
Dec 03 21:31:12 compute-0 sudo[244756]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:12 compute-0 sudo[244870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:31:12 compute-0 sudo[244870]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:12 compute-0 sudo[244870]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:12 compute-0 sudo[244895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:31:12 compute-0 sudo[244895]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:12 compute-0 podman[244932]: 2025-12-03 21:31:12.878830211 +0000 UTC m=+0.045414297 container create 2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bell, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:31:12 compute-0 systemd[1]: Started libpod-conmon-2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa.scope.
Dec 03 21:31:12 compute-0 podman[244932]: 2025-12-03 21:31:12.858190929 +0000 UTC m=+0.024775005 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:31:12 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:31:13 compute-0 podman[244932]: 2025-12-03 21:31:13.002376401 +0000 UTC m=+0.168960527 container init 2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bell, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:31:13 compute-0 podman[244932]: 2025-12-03 21:31:13.012021858 +0000 UTC m=+0.178605944 container start 2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bell, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:31:13 compute-0 podman[244932]: 2025-12-03 21:31:13.015870871 +0000 UTC m=+0.182454967 container attach 2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:31:13 compute-0 peaceful_bell[244948]: 167 167
Dec 03 21:31:13 compute-0 systemd[1]: libpod-2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa.scope: Deactivated successfully.
Dec 03 21:31:13 compute-0 podman[244932]: 2025-12-03 21:31:13.022758656 +0000 UTC m=+0.189342752 container died 2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bell, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 03 21:31:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-042257b7c2536843195381ea1b5fac8d97d0cc72aaaff3303ace091a2681f908-merged.mount: Deactivated successfully.
Dec 03 21:31:13 compute-0 podman[244932]: 2025-12-03 21:31:13.073594147 +0000 UTC m=+0.240178213 container remove 2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_bell, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2)
Dec 03 21:31:13 compute-0 systemd[1]: libpod-conmon-2716c2d70490038bcb37ec0050be3e695949ab26cb5576cddbdbd4f6606af3fa.scope: Deactivated successfully.
Dec 03 21:31:13 compute-0 ceph-mon[75204]: pgmap v743: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:13 compute-0 podman[244973]: 2025-12-03 21:31:13.280049076 +0000 UTC m=+0.065342411 container create d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_merkle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:31:13 compute-0 systemd[1]: Started libpod-conmon-d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc.scope.
Dec 03 21:31:13 compute-0 podman[244973]: 2025-12-03 21:31:13.252695603 +0000 UTC m=+0.037988978 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:31:13 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:31:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5022fc51579297474b28464f3d1f314812da28be5a57aacc61b91bacf30d180e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5022fc51579297474b28464f3d1f314812da28be5a57aacc61b91bacf30d180e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5022fc51579297474b28464f3d1f314812da28be5a57aacc61b91bacf30d180e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5022fc51579297474b28464f3d1f314812da28be5a57aacc61b91bacf30d180e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:31:13 compute-0 podman[244973]: 2025-12-03 21:31:13.376676053 +0000 UTC m=+0.161969428 container init d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_merkle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:31:13 compute-0 podman[244973]: 2025-12-03 21:31:13.392702743 +0000 UTC m=+0.177996068 container start d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_merkle, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:31:13 compute-0 podman[244973]: 2025-12-03 21:31:13.397018138 +0000 UTC m=+0.182311463 container attach d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_merkle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:31:14 compute-0 lvm[245067]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:31:14 compute-0 lvm[245067]: VG ceph_vg0 finished
Dec 03 21:31:14 compute-0 lvm[245068]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:31:14 compute-0 lvm[245068]: VG ceph_vg1 finished
Dec 03 21:31:14 compute-0 lvm[245070]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:31:14 compute-0 lvm[245070]: VG ceph_vg2 finished
Dec 03 21:31:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v744: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:14 compute-0 compassionate_merkle[244989]: {}
Dec 03 21:31:14 compute-0 systemd[1]: libpod-d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc.scope: Deactivated successfully.
Dec 03 21:31:14 compute-0 podman[244973]: 2025-12-03 21:31:14.230109447 +0000 UTC m=+1.015402782 container died d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_merkle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:31:14 compute-0 systemd[1]: libpod-d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc.scope: Consumed 1.401s CPU time.
Dec 03 21:31:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-5022fc51579297474b28464f3d1f314812da28be5a57aacc61b91bacf30d180e-merged.mount: Deactivated successfully.
Dec 03 21:31:14 compute-0 podman[244973]: 2025-12-03 21:31:14.339004894 +0000 UTC m=+1.124298189 container remove d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=compassionate_merkle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:31:14 compute-0 systemd[1]: libpod-conmon-d4bd0c6b5396cc1debbad154446dcc876740b7fe29a1901b61329b5c95432dfc.scope: Deactivated successfully.
Dec 03 21:31:14 compute-0 sudo[244895]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:31:14 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:31:14 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:31:14 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:31:14 compute-0 sudo[245087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:31:14 compute-0 sudo[245087]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:31:14 compute-0 sudo[245087]: pam_unix(sudo:session): session closed for user root
Dec 03 21:31:15 compute-0 ceph-mon[75204]: pgmap v744: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:15 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:31:15 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:31:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v745: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:17 compute-0 ceph-mon[75204]: pgmap v745: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v746: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:19 compute-0 ceph-mon[75204]: pgmap v746: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v747: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.068 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.069 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.100 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.100 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.101 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:31:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.170 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.170 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.171 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.171 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.172 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.172 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.172 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:31:21 compute-0 podman[245112]: 2025-12-03 21:31:21.21119157 +0000 UTC m=+0.138037007 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:31:21
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.data', 'images', '.mgr', 'vms', 'volumes', 'cephfs.cephfs.meta', 'backups']
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.587 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.588 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.588 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.589 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:31:21 compute-0 nova_compute[241566]: 2025-12-03 21:31:21.589 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:31:21 compute-0 ceph-mon[75204]: pgmap v747: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:31:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:31:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:31:22 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2042915418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:31:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v748: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.161 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.572s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.371 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.373 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5289MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.373 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.374 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.466 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.466 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:31:22 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.483 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:31:22 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2042915418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:31:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:31:22 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1619567213' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:31:23 compute-0 nova_compute[241566]: 2025-12-03 21:31:22.999 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:31:23 compute-0 nova_compute[241566]: 2025-12-03 21:31:23.005 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:31:23 compute-0 nova_compute[241566]: 2025-12-03 21:31:23.067 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:31:23 compute-0 nova_compute[241566]: 2025-12-03 21:31:23.069 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:31:23 compute-0 nova_compute[241566]: 2025-12-03 21:31:23.070 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:31:23 compute-0 ceph-mon[75204]: pgmap v748: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:23 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1619567213' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:31:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v749: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:25 compute-0 ceph-mon[75204]: pgmap v749: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v750: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006661126954702844 of space, bias 1.0, pg target 0.19983380864108533 quantized to 32 (current 32)
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.8006447912962508e-06 of space, bias 4.0, pg target 0.002160773749555501 quantized to 16 (current 16)
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:31:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:31:27 compute-0 ceph-mon[75204]: pgmap v750: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #36. Immutable memtables: 0.
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.815821) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 36
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797487815910, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1377, "num_deletes": 251, "total_data_size": 1430002, "memory_usage": 1455984, "flush_reason": "Manual Compaction"}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #37: started
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797487828912, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 37, "file_size": 1399605, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14336, "largest_seqno": 15712, "table_properties": {"data_size": 1393106, "index_size": 3702, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13562, "raw_average_key_size": 19, "raw_value_size": 1379946, "raw_average_value_size": 2014, "num_data_blocks": 169, "num_entries": 685, "num_filter_entries": 685, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797351, "oldest_key_time": 1764797351, "file_creation_time": 1764797487, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 13136 microseconds, and 7549 cpu microseconds.
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.828967) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #37: 1399605 bytes OK
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.828990) [db/memtable_list.cc:519] [default] Level-0 commit table #37 started
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.830187) [db/memtable_list.cc:722] [default] Level-0 commit table #37: memtable #1 done
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.830207) EVENT_LOG_v1 {"time_micros": 1764797487830202, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.830229) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1423868, prev total WAL file size 1423868, number of live WAL files 2.
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000033.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.831061) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [37(1366KB)], [35(5495KB)]
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797487831102, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [37], "files_L6": [35], "score": -1, "input_data_size": 7026676, "oldest_snapshot_seqno": -1}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #38: 3530 keys, 5817987 bytes, temperature: kUnknown
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797487875121, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 38, "file_size": 5817987, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5791426, "index_size": 16659, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8837, "raw_key_size": 83481, "raw_average_key_size": 23, "raw_value_size": 5724939, "raw_average_value_size": 1621, "num_data_blocks": 718, "num_entries": 3530, "num_filter_entries": 3530, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797487, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.875398) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 5817987 bytes
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.876970) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 159.3 rd, 131.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 5.4 +0.0 blob) out(5.5 +0.0 blob), read-write-amplify(9.2) write-amplify(4.2) OK, records in: 4048, records dropped: 518 output_compression: NoCompression
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.876998) EVENT_LOG_v1 {"time_micros": 1764797487876984, "job": 16, "event": "compaction_finished", "compaction_time_micros": 44107, "compaction_time_cpu_micros": 16663, "output_level": 6, "num_output_files": 1, "total_output_size": 5817987, "num_input_records": 4048, "num_output_records": 3530, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000037.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797487877534, "job": 16, "event": "table_file_deletion", "file_number": 37}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797487879215, "job": 16, "event": "table_file_deletion", "file_number": 35}
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.830907) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.879321) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.879328) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.879331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.879334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:31:27 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:31:27.879337) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:31:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v751: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:29 compute-0 ceph-mon[75204]: pgmap v751: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v752: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:31 compute-0 ceph-mon[75204]: pgmap v752: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v753: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e67 do_prune osdmap full prune enabled
Dec 03 21:31:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e68 e68: 3 total, 3 up, 3 in
Dec 03 21:31:32 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e68: 3 total, 3 up, 3 in
Dec 03 21:31:33 compute-0 ceph-mon[75204]: pgmap v753: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:33 compute-0 ceph-mon[75204]: osdmap e68: 3 total, 3 up, 3 in
Dec 03 21:31:34 compute-0 podman[245183]: 2025-12-03 21:31:34.120591275 +0000 UTC m=+0.055303993 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 03 21:31:34 compute-0 podman[245184]: 2025-12-03 21:31:34.126332768 +0000 UTC m=+0.056846923 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 03 21:31:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v755: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:35 compute-0 ceph-mon[75204]: pgmap v755: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:31:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v756: 177 pgs: 177 active+clean; 49 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 821 KiB/s wr, 21 op/s
Dec 03 21:31:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e68 do_prune osdmap full prune enabled
Dec 03 21:31:37 compute-0 ceph-mon[75204]: pgmap v756: 177 pgs: 177 active+clean; 49 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 821 KiB/s wr, 21 op/s
Dec 03 21:31:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e69 e69: 3 total, 3 up, 3 in
Dec 03 21:31:37 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e69: 3 total, 3 up, 3 in
Dec 03 21:31:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v758: 177 pgs: 177 active+clean; 49 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.0 MiB/s wr, 27 op/s
Dec 03 21:31:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e69 do_prune osdmap full prune enabled
Dec 03 21:31:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e70 e70: 3 total, 3 up, 3 in
Dec 03 21:31:38 compute-0 ceph-mon[75204]: osdmap e69: 3 total, 3 up, 3 in
Dec 03 21:31:38 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e70: 3 total, 3 up, 3 in
Dec 03 21:31:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e70 do_prune osdmap full prune enabled
Dec 03 21:31:39 compute-0 ceph-mon[75204]: pgmap v758: 177 pgs: 177 active+clean; 49 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.0 MiB/s wr, 27 op/s
Dec 03 21:31:39 compute-0 ceph-mon[75204]: osdmap e70: 3 total, 3 up, 3 in
Dec 03 21:31:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e71 e71: 3 total, 3 up, 3 in
Dec 03 21:31:39 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e71: 3 total, 3 up, 3 in
Dec 03 21:31:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v761: 177 pgs: 177 active+clean; 49 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.3 MiB/s wr, 36 op/s
Dec 03 21:31:40 compute-0 ceph-mon[75204]: osdmap e71: 3 total, 3 up, 3 in
Dec 03 21:31:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:41 compute-0 ceph-mon[75204]: pgmap v761: 177 pgs: 177 active+clean; 49 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.3 MiB/s wr, 36 op/s
Dec 03 21:31:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v762: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 106 KiB/s rd, 17 MiB/s wr, 154 op/s
Dec 03 21:31:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e71 do_prune osdmap full prune enabled
Dec 03 21:31:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e72 e72: 3 total, 3 up, 3 in
Dec 03 21:31:42 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e72: 3 total, 3 up, 3 in
Dec 03 21:31:43 compute-0 ceph-mon[75204]: pgmap v762: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 106 KiB/s rd, 17 MiB/s wr, 154 op/s
Dec 03 21:31:43 compute-0 ceph-mon[75204]: osdmap e72: 3 total, 3 up, 3 in
Dec 03 21:31:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v764: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 106 KiB/s rd, 17 MiB/s wr, 154 op/s
Dec 03 21:31:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e72 do_prune osdmap full prune enabled
Dec 03 21:31:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e73 e73: 3 total, 3 up, 3 in
Dec 03 21:31:44 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e73: 3 total, 3 up, 3 in
Dec 03 21:31:45 compute-0 ceph-mon[75204]: pgmap v764: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 106 KiB/s rd, 17 MiB/s wr, 154 op/s
Dec 03 21:31:45 compute-0 ceph-mon[75204]: osdmap e73: 3 total, 3 up, 3 in
Dec 03 21:31:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e73 do_prune osdmap full prune enabled
Dec 03 21:31:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e74 e74: 3 total, 3 up, 3 in
Dec 03 21:31:45 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e74: 3 total, 3 up, 3 in
Dec 03 21:31:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e74 do_prune osdmap full prune enabled
Dec 03 21:31:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e75 e75: 3 total, 3 up, 3 in
Dec 03 21:31:46 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e75: 3 total, 3 up, 3 in
Dec 03 21:31:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v768: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 135 KiB/s rd, 12 KiB/s wr, 188 op/s
Dec 03 21:31:46 compute-0 ceph-mon[75204]: osdmap e74: 3 total, 3 up, 3 in
Dec 03 21:31:46 compute-0 ceph-mon[75204]: osdmap e75: 3 total, 3 up, 3 in
Dec 03 21:31:47 compute-0 ceph-mon[75204]: pgmap v768: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 135 KiB/s rd, 12 KiB/s wr, 188 op/s
Dec 03 21:31:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v769: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 103 KiB/s rd, 9.4 KiB/s wr, 144 op/s
Dec 03 21:31:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:31:48.935 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:31:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:31:48.936 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:31:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:31:48.936 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:31:49 compute-0 ceph-mon[75204]: pgmap v769: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 103 KiB/s rd, 9.4 KiB/s wr, 144 op/s
Dec 03 21:31:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v770: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 90 KiB/s rd, 8.2 KiB/s wr, 125 op/s
Dec 03 21:31:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e75 do_prune osdmap full prune enabled
Dec 03 21:31:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e76 e76: 3 total, 3 up, 3 in
Dec 03 21:31:50 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e76: 3 total, 3 up, 3 in
Dec 03 21:31:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e76 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:31:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:31:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:31:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:31:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:31:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:31:51 compute-0 ceph-mon[75204]: pgmap v770: 177 pgs: 6 active+clean+snaptrim_wait, 4 active+clean+snaptrim, 167 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 90 KiB/s rd, 8.2 KiB/s wr, 125 op/s
Dec 03 21:31:51 compute-0 ceph-mon[75204]: osdmap e76: 3 total, 3 up, 3 in
Dec 03 21:31:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v772: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 118 KiB/s rd, 10 KiB/s wr, 159 op/s
Dec 03 21:31:52 compute-0 podman[245219]: 2025-12-03 21:31:52.201376053 +0000 UTC m=+0.131203695 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 03 21:31:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e76 do_prune osdmap full prune enabled
Dec 03 21:31:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e77 e77: 3 total, 3 up, 3 in
Dec 03 21:31:52 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e77: 3 total, 3 up, 3 in
Dec 03 21:31:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e77 do_prune osdmap full prune enabled
Dec 03 21:31:53 compute-0 ceph-mon[75204]: pgmap v772: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 118 KiB/s rd, 10 KiB/s wr, 159 op/s
Dec 03 21:31:53 compute-0 ceph-mon[75204]: osdmap e77: 3 total, 3 up, 3 in
Dec 03 21:31:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e78 e78: 3 total, 3 up, 3 in
Dec 03 21:31:54 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e78: 3 total, 3 up, 3 in
Dec 03 21:31:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v775: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 68 KiB/s rd, 6.5 KiB/s wr, 92 op/s
Dec 03 21:31:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e78 do_prune osdmap full prune enabled
Dec 03 21:31:55 compute-0 ceph-mon[75204]: osdmap e78: 3 total, 3 up, 3 in
Dec 03 21:31:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e79 e79: 3 total, 3 up, 3 in
Dec 03 21:31:55 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e79: 3 total, 3 up, 3 in
Dec 03 21:31:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e79 do_prune osdmap full prune enabled
Dec 03 21:31:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e80 e80: 3 total, 3 up, 3 in
Dec 03 21:31:56 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e80: 3 total, 3 up, 3 in
Dec 03 21:31:56 compute-0 ceph-mon[75204]: pgmap v775: 177 pgs: 177 active+clean; 41 MiB data, 122 MiB used, 60 GiB / 60 GiB avail; 68 KiB/s rd, 6.5 KiB/s wr, 92 op/s
Dec 03 21:31:56 compute-0 ceph-mon[75204]: osdmap e79: 3 total, 3 up, 3 in
Dec 03 21:31:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:31:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e80 do_prune osdmap full prune enabled
Dec 03 21:31:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e81 e81: 3 total, 3 up, 3 in
Dec 03 21:31:56 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e81: 3 total, 3 up, 3 in
Dec 03 21:31:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v779: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 109 KiB/s rd, 19 KiB/s wr, 157 op/s
Dec 03 21:31:57 compute-0 ceph-mon[75204]: osdmap e80: 3 total, 3 up, 3 in
Dec 03 21:31:57 compute-0 ceph-mon[75204]: osdmap e81: 3 total, 3 up, 3 in
Dec 03 21:31:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e81 do_prune osdmap full prune enabled
Dec 03 21:31:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e82 e82: 3 total, 3 up, 3 in
Dec 03 21:31:57 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e82: 3 total, 3 up, 3 in
Dec 03 21:31:58 compute-0 ceph-mon[75204]: pgmap v779: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 109 KiB/s rd, 19 KiB/s wr, 157 op/s
Dec 03 21:31:58 compute-0 ceph-mon[75204]: osdmap e82: 3 total, 3 up, 3 in
Dec 03 21:31:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v781: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 87 KiB/s rd, 15 KiB/s wr, 124 op/s
Dec 03 21:31:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:31:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1121350231' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:31:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:31:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1121350231' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:32:00 compute-0 ceph-mon[75204]: pgmap v781: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 87 KiB/s rd, 15 KiB/s wr, 124 op/s
Dec 03 21:32:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1121350231' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:32:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1121350231' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:32:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v782: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 67 KiB/s rd, 11 KiB/s wr, 96 op/s
Dec 03 21:32:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e82 do_prune osdmap full prune enabled
Dec 03 21:32:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e83 e83: 3 total, 3 up, 3 in
Dec 03 21:32:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e83: 3 total, 3 up, 3 in
Dec 03 21:32:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e83 do_prune osdmap full prune enabled
Dec 03 21:32:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e84 e84: 3 total, 3 up, 3 in
Dec 03 21:32:02 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e84: 3 total, 3 up, 3 in
Dec 03 21:32:02 compute-0 ceph-mon[75204]: pgmap v782: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 67 KiB/s rd, 11 KiB/s wr, 96 op/s
Dec 03 21:32:02 compute-0 ceph-mon[75204]: osdmap e83: 3 total, 3 up, 3 in
Dec 03 21:32:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v785: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 5.7 KiB/s wr, 51 op/s
Dec 03 21:32:03 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e84 do_prune osdmap full prune enabled
Dec 03 21:32:03 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e85 e85: 3 total, 3 up, 3 in
Dec 03 21:32:03 compute-0 ceph-mon[75204]: osdmap e84: 3 total, 3 up, 3 in
Dec 03 21:32:03 compute-0 ceph-mon[75204]: pgmap v785: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 5.7 KiB/s wr, 51 op/s
Dec 03 21:32:03 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e85: 3 total, 3 up, 3 in
Dec 03 21:32:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e85 do_prune osdmap full prune enabled
Dec 03 21:32:04 compute-0 ceph-mon[75204]: osdmap e85: 3 total, 3 up, 3 in
Dec 03 21:32:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e86 e86: 3 total, 3 up, 3 in
Dec 03 21:32:04 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e86: 3 total, 3 up, 3 in
Dec 03 21:32:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v788: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 8.5 KiB/s wr, 77 op/s
Dec 03 21:32:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e86 do_prune osdmap full prune enabled
Dec 03 21:32:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e87 e87: 3 total, 3 up, 3 in
Dec 03 21:32:05 compute-0 ceph-mon[75204]: osdmap e86: 3 total, 3 up, 3 in
Dec 03 21:32:05 compute-0 ceph-mon[75204]: pgmap v788: 177 pgs: 177 active+clean; 41 MiB data, 140 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 8.5 KiB/s wr, 77 op/s
Dec 03 21:32:05 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e87: 3 total, 3 up, 3 in
Dec 03 21:32:05 compute-0 podman[245246]: 2025-12-03 21:32:05.192005222 +0000 UTC m=+0.113381397 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:32:05 compute-0 podman[245245]: 2025-12-03 21:32:05.214930347 +0000 UTC m=+0.141316656 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:32:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e87 do_prune osdmap full prune enabled
Dec 03 21:32:06 compute-0 ceph-mon[75204]: osdmap e87: 3 total, 3 up, 3 in
Dec 03 21:32:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e88 e88: 3 total, 3 up, 3 in
Dec 03 21:32:06 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e88: 3 total, 3 up, 3 in
Dec 03 21:32:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v791: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 41 MiB data, 141 MiB used, 60 GiB / 60 GiB avail; 221 KiB/s rd, 16 KiB/s wr, 302 op/s
Dec 03 21:32:07 compute-0 ceph-mon[75204]: osdmap e88: 3 total, 3 up, 3 in
Dec 03 21:32:07 compute-0 ceph-mon[75204]: pgmap v791: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 41 MiB data, 141 MiB used, 60 GiB / 60 GiB avail; 221 KiB/s rd, 16 KiB/s wr, 302 op/s
Dec 03 21:32:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v792: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 41 MiB data, 141 MiB used, 60 GiB / 60 GiB avail; 174 KiB/s rd, 13 KiB/s wr, 238 op/s
Dec 03 21:32:09 compute-0 ceph-mon[75204]: pgmap v792: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 41 MiB data, 141 MiB used, 60 GiB / 60 GiB avail; 174 KiB/s rd, 13 KiB/s wr, 238 op/s
Dec 03 21:32:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v793: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 41 MiB data, 141 MiB used, 60 GiB / 60 GiB avail; 146 KiB/s rd, 11 KiB/s wr, 199 op/s
Dec 03 21:32:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e88 do_prune osdmap full prune enabled
Dec 03 21:32:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e89 e89: 3 total, 3 up, 3 in
Dec 03 21:32:11 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e89: 3 total, 3 up, 3 in
Dec 03 21:32:11 compute-0 ceph-mon[75204]: pgmap v793: 177 pgs: 2 active+clean+snaptrim, 3 active+clean+snaptrim_wait, 172 active+clean; 41 MiB data, 141 MiB used, 60 GiB / 60 GiB avail; 146 KiB/s rd, 11 KiB/s wr, 199 op/s
Dec 03 21:32:11 compute-0 ceph-mon[75204]: osdmap e89: 3 total, 3 up, 3 in
Dec 03 21:32:12 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:32:12.175 151937 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '4a:b3:fa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:85:3a:67:f5:74'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 03 21:32:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v795: 177 pgs: 177 active+clean; 41 MiB data, 158 MiB used, 60 GiB / 60 GiB avail; 153 KiB/s rd, 12 KiB/s wr, 210 op/s
Dec 03 21:32:12 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:32:12.177 151937 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 03 21:32:12 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:32:12.179 151937 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f27c01e7-5b62-4209-a664-3ae50b74644d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 21:32:13 compute-0 ceph-mon[75204]: pgmap v795: 177 pgs: 177 active+clean; 41 MiB data, 158 MiB used, 60 GiB / 60 GiB avail; 153 KiB/s rd, 12 KiB/s wr, 210 op/s
Dec 03 21:32:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v796: 177 pgs: 177 active+clean; 41 MiB data, 158 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 2.2 KiB/s wr, 34 op/s
Dec 03 21:32:14 compute-0 sudo[245286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:32:14 compute-0 sudo[245286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:14 compute-0 sudo[245286]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:14 compute-0 sudo[245311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:32:14 compute-0 sudo[245311]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:15 compute-0 ceph-mon[75204]: pgmap v796: 177 pgs: 177 active+clean; 41 MiB data, 158 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 2.2 KiB/s wr, 34 op/s
Dec 03 21:32:15 compute-0 podman[245380]: 2025-12-03 21:32:15.266916855 +0000 UTC m=+0.089229790 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:32:15 compute-0 podman[245380]: 2025-12-03 21:32:15.380114326 +0000 UTC m=+0.202427221 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:32:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e89 do_prune osdmap full prune enabled
Dec 03 21:32:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e90 e90: 3 total, 3 up, 3 in
Dec 03 21:32:16 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e90: 3 total, 3 up, 3 in
Dec 03 21:32:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v798: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 3.6 KiB/s wr, 46 op/s
Dec 03 21:32:16 compute-0 sudo[245311]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:32:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:32:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:16 compute-0 sudo[245547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:32:16 compute-0 sudo[245547]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:16 compute-0 sudo[245547]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:16 compute-0 sudo[245572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:32:16 compute-0 sudo[245572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:16 compute-0 nova_compute[241566]: 2025-12-03 21:32:16.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:16 compute-0 nova_compute[241566]: 2025-12-03 21:32:16.553 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 03 21:32:16 compute-0 nova_compute[241566]: 2025-12-03 21:32:16.567 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 03 21:32:16 compute-0 nova_compute[241566]: 2025-12-03 21:32:16.568 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:16 compute-0 nova_compute[241566]: 2025-12-03 21:32:16.569 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 03 21:32:16 compute-0 nova_compute[241566]: 2025-12-03 21:32:16.580 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:17 compute-0 ceph-mon[75204]: osdmap e90: 3 total, 3 up, 3 in
Dec 03 21:32:17 compute-0 ceph-mon[75204]: pgmap v798: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 3.6 KiB/s wr, 46 op/s
Dec 03 21:32:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:17 compute-0 sudo[245572]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:32:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:32:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:32:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:32:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:32:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:32:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:32:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:32:17 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:32:17 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:32:17 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:32:17 compute-0 sudo[245629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:32:17 compute-0 sudo[245629]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:17 compute-0 sudo[245629]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:17 compute-0 sudo[245654]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:32:17 compute-0 sudo[245654]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:17 compute-0 podman[245691]: 2025-12-03 21:32:17.785376666 +0000 UTC m=+0.065078273 container create 6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_lalande, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:32:17 compute-0 systemd[1]: Started libpod-conmon-6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4.scope.
Dec 03 21:32:17 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:32:17 compute-0 podman[245691]: 2025-12-03 21:32:17.759001379 +0000 UTC m=+0.038703046 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:32:17 compute-0 podman[245691]: 2025-12-03 21:32:17.871144273 +0000 UTC m=+0.150845950 container init 6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_lalande, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 03 21:32:17 compute-0 podman[245691]: 2025-12-03 21:32:17.879248479 +0000 UTC m=+0.158950097 container start 6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_lalande, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:32:17 compute-0 podman[245691]: 2025-12-03 21:32:17.883008381 +0000 UTC m=+0.162709998 container attach 6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_lalande, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2)
Dec 03 21:32:17 compute-0 ecstatic_lalande[245708]: 167 167
Dec 03 21:32:17 compute-0 systemd[1]: libpod-6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4.scope: Deactivated successfully.
Dec 03 21:32:17 compute-0 conmon[245708]: conmon 6795932662b1b6635465 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4.scope/container/memory.events
Dec 03 21:32:17 compute-0 podman[245691]: 2025-12-03 21:32:17.887044778 +0000 UTC m=+0.166746395 container died 6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_lalande, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:32:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-4b62ab10f2b7cfee39bdb7e841b273aa26926990870153e2ce1ca765ee7a3cdf-merged.mount: Deactivated successfully.
Dec 03 21:32:17 compute-0 podman[245691]: 2025-12-03 21:32:17.941204119 +0000 UTC m=+0.220905696 container remove 6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=ecstatic_lalande, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:32:17 compute-0 systemd[1]: libpod-conmon-6795932662b1b6635465de63b60b8271ebdeb997a3e309f4aabda20089bdb8f4.scope: Deactivated successfully.
Dec 03 21:32:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:32:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:32:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:32:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:32:18 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:32:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v799: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 3.6 KiB/s wr, 46 op/s
Dec 03 21:32:18 compute-0 podman[245732]: 2025-12-03 21:32:18.189768725 +0000 UTC m=+0.056696689 container create 91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bassi, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:32:18 compute-0 systemd[1]: Started libpod-conmon-91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4.scope.
Dec 03 21:32:18 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ad5855244fc81a1f9e5d9daae6f4a911ee79b229f376b15e559c4d841220c80/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:18 compute-0 podman[245732]: 2025-12-03 21:32:18.172352469 +0000 UTC m=+0.039280433 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ad5855244fc81a1f9e5d9daae6f4a911ee79b229f376b15e559c4d841220c80/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ad5855244fc81a1f9e5d9daae6f4a911ee79b229f376b15e559c4d841220c80/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ad5855244fc81a1f9e5d9daae6f4a911ee79b229f376b15e559c4d841220c80/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ad5855244fc81a1f9e5d9daae6f4a911ee79b229f376b15e559c4d841220c80/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:18 compute-0 podman[245732]: 2025-12-03 21:32:18.287464671 +0000 UTC m=+0.154392655 container init 91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bassi, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:32:18 compute-0 podman[245732]: 2025-12-03 21:32:18.298444085 +0000 UTC m=+0.165372029 container start 91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bassi, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:32:18 compute-0 podman[245732]: 2025-12-03 21:32:18.302167165 +0000 UTC m=+0.169095159 container attach 91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bassi, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:32:18 compute-0 nova_compute[241566]: 2025-12-03 21:32:18.582 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:18 compute-0 cool_bassi[245748]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:32:18 compute-0 cool_bassi[245748]: --> All data devices are unavailable
Dec 03 21:32:18 compute-0 systemd[1]: libpod-91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4.scope: Deactivated successfully.
Dec 03 21:32:18 compute-0 podman[245768]: 2025-12-03 21:32:18.921438658 +0000 UTC m=+0.030020434 container died 91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bassi, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:32:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ad5855244fc81a1f9e5d9daae6f4a911ee79b229f376b15e559c4d841220c80-merged.mount: Deactivated successfully.
Dec 03 21:32:18 compute-0 podman[245768]: 2025-12-03 21:32:18.967496772 +0000 UTC m=+0.076078448 container remove 91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_bassi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:32:18 compute-0 systemd[1]: libpod-conmon-91d0740a84b35eec3d8e9d46f1de178dfb82b0dbdc35ce46961120f6af5c39b4.scope: Deactivated successfully.
Dec 03 21:32:19 compute-0 sudo[245654]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:19 compute-0 sudo[245783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:32:19 compute-0 sudo[245783]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:19 compute-0 sudo[245783]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:19 compute-0 sudo[245808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:32:19 compute-0 sudo[245808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:19 compute-0 ceph-mon[75204]: pgmap v799: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 3.6 KiB/s wr, 46 op/s
Dec 03 21:32:19 compute-0 podman[245846]: 2025-12-03 21:32:19.498113081 +0000 UTC m=+0.050175735 container create d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_kapitsa, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:32:19 compute-0 systemd[1]: Started libpod-conmon-d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c.scope.
Dec 03 21:32:19 compute-0 nova_compute[241566]: 2025-12-03 21:32:19.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:19 compute-0 nova_compute[241566]: 2025-12-03 21:32:19.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:19 compute-0 nova_compute[241566]: 2025-12-03 21:32:19.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:32:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:32:19 compute-0 podman[245846]: 2025-12-03 21:32:19.476269536 +0000 UTC m=+0.028332190 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:32:19 compute-0 podman[245846]: 2025-12-03 21:32:19.583137108 +0000 UTC m=+0.135199762 container init d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_kapitsa, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:32:19 compute-0 podman[245846]: 2025-12-03 21:32:19.594830221 +0000 UTC m=+0.146892845 container start d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_kapitsa, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 21:32:19 compute-0 podman[245846]: 2025-12-03 21:32:19.5981539 +0000 UTC m=+0.150216554 container attach d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_kapitsa, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:32:19 compute-0 infallible_kapitsa[245862]: 167 167
Dec 03 21:32:19 compute-0 systemd[1]: libpod-d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c.scope: Deactivated successfully.
Dec 03 21:32:19 compute-0 conmon[245862]: conmon d2fed066020d5eaa6beb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c.scope/container/memory.events
Dec 03 21:32:19 compute-0 podman[245846]: 2025-12-03 21:32:19.600339509 +0000 UTC m=+0.152402163 container died d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_kapitsa, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:32:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-83d8fe3b02d6e340e19526364f565cb8f215d725cc8e8f969e514759f451cb74-merged.mount: Deactivated successfully.
Dec 03 21:32:19 compute-0 podman[245846]: 2025-12-03 21:32:19.645345934 +0000 UTC m=+0.197408588 container remove d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_kapitsa, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:32:19 compute-0 systemd[1]: libpod-conmon-d2fed066020d5eaa6beb940ee4e6aac1b57f16e0b68d80f37f918e1c4626860c.scope: Deactivated successfully.
Dec 03 21:32:19 compute-0 podman[245885]: 2025-12-03 21:32:19.859133948 +0000 UTC m=+0.066389249 container create ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_easley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:32:19 compute-0 systemd[1]: Started libpod-conmon-ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc.scope.
Dec 03 21:32:19 compute-0 podman[245885]: 2025-12-03 21:32:19.834851648 +0000 UTC m=+0.042106979 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:32:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:32:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06b4fd392b550d2ed9112c9cdfbbbbb9ad63bf3c806ac7531b3c51e0bbe25293/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06b4fd392b550d2ed9112c9cdfbbbbb9ad63bf3c806ac7531b3c51e0bbe25293/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06b4fd392b550d2ed9112c9cdfbbbbb9ad63bf3c806ac7531b3c51e0bbe25293/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06b4fd392b550d2ed9112c9cdfbbbbb9ad63bf3c806ac7531b3c51e0bbe25293/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:19 compute-0 podman[245885]: 2025-12-03 21:32:19.970194143 +0000 UTC m=+0.177449504 container init ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_easley, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:32:19 compute-0 podman[245885]: 2025-12-03 21:32:19.98351557 +0000 UTC m=+0.190770841 container start ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_easley, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:32:19 compute-0 podman[245885]: 2025-12-03 21:32:19.987005623 +0000 UTC m=+0.194260904 container attach ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_easley, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:32:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v800: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 7.3 KiB/s rd, 1.2 KiB/s wr, 10 op/s
Dec 03 21:32:20 compute-0 affectionate_easley[245902]: {
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:     "0": [
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:         {
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "devices": [
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "/dev/loop3"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             ],
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_name": "ceph_lv0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_size": "21470642176",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "name": "ceph_lv0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "tags": {
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cluster_name": "ceph",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.crush_device_class": "",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.encrypted": "0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.objectstore": "bluestore",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osd_id": "0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.type": "block",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.vdo": "0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.with_tpm": "0"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             },
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "type": "block",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "vg_name": "ceph_vg0"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:         }
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:     ],
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:     "1": [
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:         {
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "devices": [
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "/dev/loop4"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             ],
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_name": "ceph_lv1",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_size": "21470642176",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "name": "ceph_lv1",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "tags": {
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cluster_name": "ceph",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.crush_device_class": "",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.encrypted": "0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.objectstore": "bluestore",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osd_id": "1",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.type": "block",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.vdo": "0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.with_tpm": "0"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             },
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "type": "block",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "vg_name": "ceph_vg1"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:         }
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:     ],
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:     "2": [
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:         {
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "devices": [
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "/dev/loop5"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             ],
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_name": "ceph_lv2",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_size": "21470642176",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "name": "ceph_lv2",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "tags": {
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.cluster_name": "ceph",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.crush_device_class": "",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.encrypted": "0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.objectstore": "bluestore",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osd_id": "2",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.type": "block",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.vdo": "0",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:                 "ceph.with_tpm": "0"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             },
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "type": "block",
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:             "vg_name": "ceph_vg2"
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:         }
Dec 03 21:32:20 compute-0 affectionate_easley[245902]:     ]
Dec 03 21:32:20 compute-0 affectionate_easley[245902]: }
Dec 03 21:32:20 compute-0 systemd[1]: libpod-ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc.scope: Deactivated successfully.
Dec 03 21:32:20 compute-0 podman[245885]: 2025-12-03 21:32:20.316548917 +0000 UTC m=+0.523804228 container died ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_easley, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:32:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-06b4fd392b550d2ed9112c9cdfbbbbb9ad63bf3c806ac7531b3c51e0bbe25293-merged.mount: Deactivated successfully.
Dec 03 21:32:20 compute-0 podman[245885]: 2025-12-03 21:32:20.36891486 +0000 UTC m=+0.576170191 container remove ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_easley, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:32:20 compute-0 systemd[1]: libpod-conmon-ae5e23684d57df630a78d669321a4cd7698c6ff9a7ea250fa0eda57f8187cbcc.scope: Deactivated successfully.
Dec 03 21:32:20 compute-0 sudo[245808]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:20 compute-0 sudo[245921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:32:20 compute-0 sudo[245921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:20 compute-0 sudo[245921]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:20 compute-0 nova_compute[241566]: 2025-12-03 21:32:20.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:20 compute-0 nova_compute[241566]: 2025-12-03 21:32:20.553 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:32:20 compute-0 nova_compute[241566]: 2025-12-03 21:32:20.553 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:32:20 compute-0 nova_compute[241566]: 2025-12-03 21:32:20.569 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:32:20 compute-0 nova_compute[241566]: 2025-12-03 21:32:20.569 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:20 compute-0 sudo[245946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:32:20 compute-0 sudo[245946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:20 compute-0 podman[245985]: 2025-12-03 21:32:20.973252383 +0000 UTC m=+0.074335802 container create c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_germain, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:32:21 compute-0 systemd[1]: Started libpod-conmon-c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a.scope.
Dec 03 21:32:21 compute-0 podman[245985]: 2025-12-03 21:32:20.938647157 +0000 UTC m=+0.039730636 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:32:21 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:32:21 compute-0 podman[245985]: 2025-12-03 21:32:21.085789307 +0000 UTC m=+0.186872786 container init c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_germain, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, ceph=True)
Dec 03 21:32:21 compute-0 podman[245985]: 2025-12-03 21:32:21.095499937 +0000 UTC m=+0.196583346 container start c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_germain, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:32:21 compute-0 podman[245985]: 2025-12-03 21:32:21.099463393 +0000 UTC m=+0.200546862 container attach c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_germain, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:32:21 compute-0 exciting_germain[246001]: 167 167
Dec 03 21:32:21 compute-0 systemd[1]: libpod-c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a.scope: Deactivated successfully.
Dec 03 21:32:21 compute-0 podman[245985]: 2025-12-03 21:32:21.100454749 +0000 UTC m=+0.201538128 container died c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_germain, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:32:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-6e18369da5eb15e1b4bf6e6dc8e37b6b6ceb285cc389b14c3afa6fc1c36df77b-merged.mount: Deactivated successfully.
Dec 03 21:32:21 compute-0 podman[245985]: 2025-12-03 21:32:21.140603005 +0000 UTC m=+0.241686364 container remove c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=exciting_germain, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:32:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:21 compute-0 systemd[1]: libpod-conmon-c9d8df0660a64aded10681a0d05fefbed1c840244a55acc597e035aa02484d3a.scope: Deactivated successfully.
Dec 03 21:32:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e90 do_prune osdmap full prune enabled
Dec 03 21:32:21 compute-0 ceph-mon[75204]: pgmap v800: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 7.3 KiB/s rd, 1.2 KiB/s wr, 10 op/s
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:32:21
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['backups', 'volumes', 'cephfs.cephfs.data', 'vms', 'cephfs.cephfs.meta', 'images', '.mgr']
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:32:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e91 e91: 3 total, 3 up, 3 in
Dec 03 21:32:21 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e91: 3 total, 3 up, 3 in
Dec 03 21:32:21 compute-0 podman[246025]: 2025-12-03 21:32:21.401696037 +0000 UTC m=+0.073328005 container create 7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_nobel, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:32:21 compute-0 systemd[1]: Started libpod-conmon-7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382.scope.
Dec 03 21:32:21 compute-0 podman[246025]: 2025-12-03 21:32:21.372874035 +0000 UTC m=+0.044506053 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:32:21 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:32:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64d0465c66398b3631c6efe3b419d7929d64ce32e71fe5147d85d0528e22a331/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64d0465c66398b3631c6efe3b419d7929d64ce32e71fe5147d85d0528e22a331/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64d0465c66398b3631c6efe3b419d7929d64ce32e71fe5147d85d0528e22a331/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64d0465c66398b3631c6efe3b419d7929d64ce32e71fe5147d85d0528e22a331/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:32:21 compute-0 podman[246025]: 2025-12-03 21:32:21.528303797 +0000 UTC m=+0.199935825 container init 7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_nobel, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:32:21 compute-0 podman[246025]: 2025-12-03 21:32:21.539694882 +0000 UTC m=+0.211326850 container start 7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_nobel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:32:21 compute-0 podman[246025]: 2025-12-03 21:32:21.544015188 +0000 UTC m=+0.215647206 container attach 7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_nobel, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.553 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.553 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.583 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.584 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.585 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.585 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:32:21 compute-0 nova_compute[241566]: 2025-12-03 21:32:21.586 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:32:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:32:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:32:22 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1052663395' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.152 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v802: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.0 KiB/s wr, 27 op/s
Dec 03 21:32:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e91 do_prune osdmap full prune enabled
Dec 03 21:32:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e92 e92: 3 total, 3 up, 3 in
Dec 03 21:32:22 compute-0 ceph-mon[75204]: osdmap e91: 3 total, 3 up, 3 in
Dec 03 21:32:22 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1052663395' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:22 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e92: 3 total, 3 up, 3 in
Dec 03 21:32:22 compute-0 lvm[246152]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:32:22 compute-0 lvm[246153]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:32:22 compute-0 lvm[246153]: VG ceph_vg1 finished
Dec 03 21:32:22 compute-0 lvm[246152]: VG ceph_vg2 finished
Dec 03 21:32:22 compute-0 lvm[246150]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:32:22 compute-0 lvm[246150]: VG ceph_vg0 finished
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.341 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.342 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5153MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.343 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.343 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:22 compute-0 nostalgic_nobel[246042]: {}
Dec 03 21:32:22 compute-0 podman[246139]: 2025-12-03 21:32:22.40748474 +0000 UTC m=+0.124626439 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:32:22 compute-0 systemd[1]: libpod-7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382.scope: Deactivated successfully.
Dec 03 21:32:22 compute-0 systemd[1]: libpod-7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382.scope: Consumed 1.363s CPU time.
Dec 03 21:32:22 compute-0 podman[246173]: 2025-12-03 21:32:22.460826659 +0000 UTC m=+0.031971248 container died 7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_nobel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:32:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-64d0465c66398b3631c6efe3b419d7929d64ce32e71fe5147d85d0528e22a331-merged.mount: Deactivated successfully.
Dec 03 21:32:22 compute-0 podman[246173]: 2025-12-03 21:32:22.498256861 +0000 UTC m=+0.069401430 container remove 7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_nobel, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:32:22 compute-0 systemd[1]: libpod-conmon-7578836eca05fb9a56cea0d847e36c1a29ea04d6c2a7feb40c80390b4f044382.scope: Deactivated successfully.
Dec 03 21:32:22 compute-0 sudo[245946]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:32:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:22 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:32:22 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.576 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.577 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:32:22 compute-0 sudo[246189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:32:22 compute-0 sudo[246189]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:32:22 compute-0 sudo[246189]: pam_unix(sudo:session): session closed for user root
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.640 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing inventories for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.718 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Updating ProviderTree inventory for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.719 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Updating inventory in ProviderTree for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.736 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing aggregate associations for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.767 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing trait associations for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775, traits: HW_CPU_X86_SSE4A,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AESNI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_F16C,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 03 21:32:22 compute-0 nova_compute[241566]: 2025-12-03 21:32:22.781 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:32:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3524126243' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:23 compute-0 ceph-mon[75204]: pgmap v802: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 2.0 KiB/s wr, 27 op/s
Dec 03 21:32:23 compute-0 ceph-mon[75204]: osdmap e92: 3 total, 3 up, 3 in
Dec 03 21:32:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:23 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:32:23 compute-0 nova_compute[241566]: 2025-12-03 21:32:23.462 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.681s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:23 compute-0 nova_compute[241566]: 2025-12-03 21:32:23.471 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:32:23 compute-0 nova_compute[241566]: 2025-12-03 21:32:23.491 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:32:23 compute-0 nova_compute[241566]: 2025-12-03 21:32:23.492 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:32:23 compute-0 nova_compute[241566]: 2025-12-03 21:32:23.493 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v804: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 639 B/s wr, 15 op/s
Dec 03 21:32:24 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3524126243' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:25 compute-0 ceph-mon[75204]: pgmap v804: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 639 B/s wr, 15 op/s
Dec 03 21:32:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v805: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 2.5 KiB/s wr, 37 op/s
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.332 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "ab23bcbe-2091-4277-8f17-e9554b017c36" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.333 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "ab23bcbe-2091-4277-8f17-e9554b017c36" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.367 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 03 21:32:27 compute-0 ceph-mon[75204]: pgmap v805: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 2.5 KiB/s wr, 37 op/s
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.499 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.500 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.508 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.509 241570 INFO nova.compute.claims [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Claim successful on node compute-0.ctlplane.example.com
Dec 03 21:32:27 compute-0 nova_compute[241566]: 2025-12-03 21:32:27.624 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.9986981750039173e-07 of space, bias 1.0, pg target 5.996094525011752e-05 quantized to 32 (current 32)
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006677743597888245 of space, bias 1.0, pg target 0.20033230793664736 quantized to 32 (current 32)
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 1.1589623851345146e-06 of space, bias 4.0, pg target 0.0013907548621614175 quantized to 16 (current 16)
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:32:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:32:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:32:28 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2052140997' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.157 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.163 241570 DEBUG nova.compute.provider_tree [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.182 241570 DEBUG nova.scheduler.client.report [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:32:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v806: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 2.5 KiB/s wr, 37 op/s
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.206 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.206 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.269 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.269 241570 DEBUG nova.network.neutron [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.305 241570 INFO nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.323 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.363 241570 INFO nova.virt.block_device [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Booting with volume 74f6cb4b-c1f6-4650-97bb-811b731c0960 at /dev/vda
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.822 241570 DEBUG os_brick.utils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.100', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-0.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.824 241570 INFO oslo.privsep.daemon [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmp82yjnbzm/privsep.sock']
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.956 241570 DEBUG nova.network.neutron [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 03 21:32:28 compute-0 nova_compute[241566]: 2025-12-03 21:32:28.957 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 03 21:32:29 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2052140997' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.588 241570 INFO oslo.privsep.daemon [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Spawned new privsep daemon via rootwrap
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.447 246262 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.455 246262 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.460 246262 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.461 246262 INFO oslo.privsep.daemon [-] privsep daemon running as pid 246262
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.591 246262 DEBUG oslo.privsep.daemon [-] privsep: reply[1895525e-2aab-48de-a2ca-51b3dfe3df0b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.711 246262 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.722 246262 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.723 246262 DEBUG oslo.privsep.daemon [-] privsep: reply[7d88742c-bd70-4052-b586-9b8ae30e491f]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.724 246262 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.732 246262 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.732 246262 DEBUG oslo.privsep.daemon [-] privsep: reply[eeda5bc5-2d3c-490f-92ec-440b6f613311]: (4, ('InitiatorName=iqn.1994-05.com.redhat:9ad6421bbbcd', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.734 246262 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.747 246262 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.748 246262 DEBUG oslo.privsep.daemon [-] privsep: reply[ddee79ae-d7ef-493f-8d41-2ab47fe31233]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.749 246262 DEBUG oslo.privsep.daemon [-] privsep: reply[6fc2d889-d255-4ab6-8358-cfdc2d408fa1]: (4, 'fe808748-0a27-4a3c-9875-a9777da5fa17') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.750 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.774 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.777 241570 DEBUG os_brick.initiator.connectors.lightos [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.777 241570 DEBUG os_brick.initiator.connectors.lightos [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.778 241570 DEBUG os_brick.initiator.connectors.lightos [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:bf3e0a14-a5f8-4123-aa26-e7cad37b879a dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.778 241570 DEBUG os_brick.utils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] <== get_connector_properties: return (955ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.100', 'host': 'compute-0.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:9ad6421bbbcd', 'do_local_attach': False, 'nvme_hostid': 'bf3e0a14-a5f8-4123-aa26-e7cad37b879a', 'system uuid': 'fe808748-0a27-4a3c-9875-a9777da5fa17', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:bf3e0a14-a5f8-4123-aa26-e7cad37b879a', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Dec 03 21:32:29 compute-0 nova_compute[241566]: 2025-12-03 21:32:29.779 241570 DEBUG nova.virt.block_device [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Updating existing volume attachment record: 4cd2ac45-ce27-450a-b9c6-58e7e1803ad9 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Dec 03 21:32:30 compute-0 ceph-mon[75204]: pgmap v806: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 2.5 KiB/s wr, 37 op/s
Dec 03 21:32:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v807: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 15 KiB/s rd, 1.7 KiB/s wr, 20 op/s
Dec 03 21:32:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 03 21:32:30 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3498212241' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.961 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.964 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.964 241570 INFO nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Creating image(s)
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.965 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.966 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Ensure instance console log exists: /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.966 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.967 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.967 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.969 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'delete_on_termination': True, 'disk_bus': 'virtio', 'mount_device': '/dev/vda', 'attachment_id': '4cd2ac45-ce27-450a-b9c6-58e7e1803ad9', 'device_type': 'disk', 'boot_index': 0, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-74f6cb4b-c1f6-4650-97bb-811b731c0960', 'hosts': ['192.168.122.100'], 'ports': ['6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '74f6cb4b-c1f6-4650-97bb-811b731c0960', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ab23bcbe-2091-4277-8f17-e9554b017c36', 'attached_at': '', 'detached_at': '', 'volume_id': '74f6cb4b-c1f6-4650-97bb-811b731c0960', 'serial': '74f6cb4b-c1f6-4650-97bb-811b731c0960'}, 'guest_format': None, 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.973 241570 WARNING nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.978 241570 DEBUG nova.virt.libvirt.host [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.979 241570 DEBUG nova.virt.libvirt.host [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.982 241570 DEBUG nova.virt.libvirt.host [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.983 241570 DEBUG nova.virt.libvirt.host [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.984 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.984 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T21:30:47Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e4062fae-6b9c-487c-944b-c7d7fb777ccb',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.985 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.985 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.986 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.986 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.987 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.987 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.988 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.988 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.989 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 03 21:32:30 compute-0 nova_compute[241566]: 2025-12-03 21:32:30.989 241570 DEBUG nova.virt.hardware [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.025 241570 DEBUG nova.storage.rbd_utils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image ab23bcbe-2091-4277-8f17-e9554b017c36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.030 241570 DEBUG nova.privsep.utils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.031 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:31 compute-0 ceph-mon[75204]: pgmap v807: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 15 KiB/s rd, 1.7 KiB/s wr, 20 op/s
Dec 03 21:32:31 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3498212241' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e92 do_prune osdmap full prune enabled
Dec 03 21:32:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e93 e93: 3 total, 3 up, 3 in
Dec 03 21:32:31 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e93: 3 total, 3 up, 3 in
Dec 03 21:32:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 03 21:32:31 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1291370575' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.602 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.604 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.605 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.606 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:31 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 03 21:32:31 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.682 241570 DEBUG nova.objects.instance [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lazy-loading 'pci_devices' on Instance uuid ab23bcbe-2091-4277-8f17-e9554b017c36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.696 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] End _get_guest_xml xml=<domain type="kvm">
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <uuid>ab23bcbe-2091-4277-8f17-e9554b017c36</uuid>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <name>instance-00000001</name>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <memory>131072</memory>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <vcpu>1</vcpu>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <metadata>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <nova:name>instance-depend-image</nova:name>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <nova:creationTime>2025-12-03 21:32:30</nova:creationTime>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <nova:flavor name="m1.nano">
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <nova:memory>128</nova:memory>
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <nova:disk>1</nova:disk>
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <nova:swap>0</nova:swap>
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <nova:vcpus>1</nova:vcpus>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       </nova:flavor>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <nova:owner>
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <nova:user uuid="bc25c6732c60417d92846f1367ba9a4f">tempest-ImageDependencyTests-323442990-project-member</nova:user>
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <nova:project uuid="11092597966341b0915e8c2a6530e568">tempest-ImageDependencyTests-323442990</nova:project>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       </nova:owner>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <nova:ports/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </nova:instance>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   </metadata>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <sysinfo type="smbios">
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <system>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <entry name="manufacturer">RDO</entry>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <entry name="product">OpenStack Compute</entry>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <entry name="serial">ab23bcbe-2091-4277-8f17-e9554b017c36</entry>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <entry name="uuid">ab23bcbe-2091-4277-8f17-e9554b017c36</entry>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <entry name="family">Virtual Machine</entry>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </system>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   </sysinfo>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <os>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <boot dev="hd"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <smbios mode="sysinfo"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   </os>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <features>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <acpi/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <apic/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <vmcoreinfo/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   </features>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <clock offset="utc">
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <timer name="hpet" present="no"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   </clock>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <cpu mode="host-model" match="exact">
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   </cpu>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   <devices>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <disk type="network" device="cdrom">
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <driver type="raw" cache="none"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <source protocol="rbd" name="vms/ab23bcbe-2091-4277-8f17-e9554b017c36_disk.config">
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <host name="192.168.122.100" port="6789"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       </source>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <auth username="openstack">
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <secret type="ceph" uuid="c21de27e-a7fd-594b-8324-0697ba9aab3a"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       </auth>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <target dev="sda" bus="sata"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <disk type="network" device="disk">
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <source protocol="rbd" name="volumes/volume-74f6cb4b-c1f6-4650-97bb-811b731c0960">
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <host name="192.168.122.100" port="6789"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       </source>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <auth username="openstack">
Dec 03 21:32:31 compute-0 nova_compute[241566]:         <secret type="ceph" uuid="c21de27e-a7fd-594b-8324-0697ba9aab3a"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       </auth>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <target dev="vda" bus="virtio"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <serial>74f6cb4b-c1f6-4650-97bb-811b731c0960</serial>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <serial type="pty">
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <log file="/var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/console.log" append="off"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </serial>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <video>
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <model type="virtio"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </video>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <input type="tablet" bus="usb"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <rng model="virtio">
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <backend model="random">/dev/urandom</backend>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </rng>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <controller type="usb" index="0"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     <memballoon model="virtio">
Dec 03 21:32:31 compute-0 nova_compute[241566]:       <stats period="10"/>
Dec 03 21:32:31 compute-0 nova_compute[241566]:     </memballoon>
Dec 03 21:32:31 compute-0 nova_compute[241566]:   </devices>
Dec 03 21:32:31 compute-0 nova_compute[241566]: </domain>
Dec 03 21:32:31 compute-0 nova_compute[241566]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.748 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.749 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.749 241570 INFO nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Using config drive
Dec 03 21:32:31 compute-0 nova_compute[241566]: 2025-12-03 21:32:31.780 241570 DEBUG nova.storage.rbd_utils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image ab23bcbe-2091-4277-8f17-e9554b017c36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:32 compute-0 ceph-mon[75204]: osdmap e93: 3 total, 3 up, 3 in
Dec 03 21:32:32 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1291370575' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v809: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 1.5 KiB/s wr, 18 op/s
Dec 03 21:32:32 compute-0 nova_compute[241566]: 2025-12-03 21:32:32.854 241570 INFO nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Creating config drive at /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/disk.config
Dec 03 21:32:32 compute-0 nova_compute[241566]: 2025-12-03 21:32:32.860 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmoyd8gii execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:33 compute-0 nova_compute[241566]: 2025-12-03 21:32:33.003 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmoyd8gii" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:33 compute-0 nova_compute[241566]: 2025-12-03 21:32:33.029 241570 DEBUG nova.storage.rbd_utils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image ab23bcbe-2091-4277-8f17-e9554b017c36_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:33 compute-0 nova_compute[241566]: 2025-12-03 21:32:33.032 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/disk.config ab23bcbe-2091-4277-8f17-e9554b017c36_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e93 do_prune osdmap full prune enabled
Dec 03 21:32:33 compute-0 ceph-mon[75204]: pgmap v809: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 1.5 KiB/s wr, 18 op/s
Dec 03 21:32:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e94 e94: 3 total, 3 up, 3 in
Dec 03 21:32:33 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e94: 3 total, 3 up, 3 in
Dec 03 21:32:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e94 do_prune osdmap full prune enabled
Dec 03 21:32:34 compute-0 ceph-mon[75204]: osdmap e94: 3 total, 3 up, 3 in
Dec 03 21:32:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v811: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:32:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e95 e95: 3 total, 3 up, 3 in
Dec 03 21:32:34 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e95: 3 total, 3 up, 3 in
Dec 03 21:32:34 compute-0 nova_compute[241566]: 2025-12-03 21:32:34.284 241570 DEBUG oslo_concurrency.processutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/disk.config ab23bcbe-2091-4277-8f17-e9554b017c36_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.251s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:34 compute-0 nova_compute[241566]: 2025-12-03 21:32:34.285 241570 INFO nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Deleting local config drive /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36/disk.config because it was imported into RBD.
Dec 03 21:32:34 compute-0 systemd-machined[203931]: New machine qemu-1-instance-00000001.
Dec 03 21:32:34 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.042 241570 DEBUG nova.virt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Emitting event <LifecycleEvent: 1764797555.0419087, ab23bcbe-2091-4277-8f17-e9554b017c36 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.044 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] VM Resumed (Lifecycle Event)
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.050 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.051 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.056 241570 INFO nova.virt.libvirt.driver [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Instance spawned successfully.
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.057 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.105 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.120 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.124 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.125 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.126 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.127 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.128 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.129 241570 DEBUG nova.virt.libvirt.driver [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.163 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.164 241570 DEBUG nova.virt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Emitting event <LifecycleEvent: 1764797555.0435162, ab23bcbe-2091-4277-8f17-e9554b017c36 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.165 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] VM Started (Lifecycle Event)
Dec 03 21:32:35 compute-0 ceph-mon[75204]: pgmap v811: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:32:35 compute-0 ceph-mon[75204]: osdmap e95: 3 total, 3 up, 3 in
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.192 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.197 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.205 241570 INFO nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Took 4.24 seconds to spawn the instance on the hypervisor.
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.207 241570 DEBUG nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.223 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.277 241570 INFO nova.compute.manager [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Took 7.81 seconds to build instance.
Dec 03 21:32:35 compute-0 nova_compute[241566]: 2025-12-03 21:32:35.299 241570 DEBUG oslo_concurrency.lockutils [None req-74cc14fe-fcfb-4dad-9e1d-36a690c180a3 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "ab23bcbe-2091-4277-8f17-e9554b017c36" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:36 compute-0 podman[246446]: 2025-12-03 21:32:36.139260459 +0000 UTC m=+0.068060735 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:32:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:36 compute-0 podman[246445]: 2025-12-03 21:32:36.155879634 +0000 UTC m=+0.085066830 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible)
Dec 03 21:32:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v813: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 25 KiB/s wr, 26 op/s
Dec 03 21:32:37 compute-0 ceph-mon[75204]: pgmap v813: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 25 KiB/s wr, 26 op/s
Dec 03 21:32:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v814: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 22 KiB/s wr, 22 op/s
Dec 03 21:32:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e95 do_prune osdmap full prune enabled
Dec 03 21:32:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e96 e96: 3 total, 3 up, 3 in
Dec 03 21:32:38 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e96: 3 total, 3 up, 3 in
Dec 03 21:32:39 compute-0 ceph-mon[75204]: pgmap v814: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 22 KiB/s wr, 22 op/s
Dec 03 21:32:39 compute-0 ceph-mon[75204]: osdmap e96: 3 total, 3 up, 3 in
Dec 03 21:32:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e96 do_prune osdmap full prune enabled
Dec 03 21:32:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e97 e97: 3 total, 3 up, 3 in
Dec 03 21:32:39 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e97: 3 total, 3 up, 3 in
Dec 03 21:32:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v817: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 25 KiB/s wr, 26 op/s
Dec 03 21:32:40 compute-0 ceph-mon[75204]: osdmap e97: 3 total, 3 up, 3 in
Dec 03 21:32:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e97 do_prune osdmap full prune enabled
Dec 03 21:32:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e98 e98: 3 total, 3 up, 3 in
Dec 03 21:32:41 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e98: 3 total, 3 up, 3 in
Dec 03 21:32:41 compute-0 ceph-mon[75204]: pgmap v817: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 25 KiB/s wr, 26 op/s
Dec 03 21:32:41 compute-0 ceph-mon[75204]: osdmap e98: 3 total, 3 up, 3 in
Dec 03 21:32:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v819: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 76 KiB/s rd, 3.7 KiB/s wr, 97 op/s
Dec 03 21:32:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e98 do_prune osdmap full prune enabled
Dec 03 21:32:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e99 e99: 3 total, 3 up, 3 in
Dec 03 21:32:42 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e99: 3 total, 3 up, 3 in
Dec 03 21:32:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e99 do_prune osdmap full prune enabled
Dec 03 21:32:43 compute-0 ceph-mon[75204]: pgmap v819: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 76 KiB/s rd, 3.7 KiB/s wr, 97 op/s
Dec 03 21:32:43 compute-0 ceph-mon[75204]: osdmap e99: 3 total, 3 up, 3 in
Dec 03 21:32:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e100 e100: 3 total, 3 up, 3 in
Dec 03 21:32:43 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e100: 3 total, 3 up, 3 in
Dec 03 21:32:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v822: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 92 KiB/s rd, 4.4 KiB/s wr, 117 op/s
Dec 03 21:32:44 compute-0 ceph-mon[75204]: osdmap e100: 3 total, 3 up, 3 in
Dec 03 21:32:45 compute-0 ceph-mon[75204]: pgmap v822: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 92 KiB/s rd, 4.4 KiB/s wr, 117 op/s
Dec 03 21:32:45 compute-0 nova_compute[241566]: 2025-12-03 21:32:45.783 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "b947bb8b-dad6-41ce-9f54-836a10775855" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:45 compute-0 nova_compute[241566]: 2025-12-03 21:32:45.784 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "b947bb8b-dad6-41ce-9f54-836a10775855" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:45 compute-0 nova_compute[241566]: 2025-12-03 21:32:45.804 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 03 21:32:45 compute-0 nova_compute[241566]: 2025-12-03 21:32:45.898 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:45 compute-0 nova_compute[241566]: 2025-12-03 21:32:45.898 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:45 compute-0 nova_compute[241566]: 2025-12-03 21:32:45.925 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 03 21:32:45 compute-0 nova_compute[241566]: 2025-12-03 21:32:45.925 241570 INFO nova.compute.claims [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Claim successful on node compute-0.ctlplane.example.com
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.114 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v823: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 8.0 KiB/s wr, 166 op/s
Dec 03 21:32:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:32:46 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1613450328' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.641 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.650 241570 DEBUG nova.compute.provider_tree [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.672 241570 DEBUG nova.scheduler.client.report [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.703 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.704 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.774 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.775 241570 DEBUG nova.network.neutron [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.801 241570 INFO nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.831 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.935 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.938 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.938 241570 INFO nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Creating image(s)
Dec 03 21:32:46 compute-0 nova_compute[241566]: 2025-12-03 21:32:46.970 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image b947bb8b-dad6-41ce-9f54-836a10775855_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.003 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image b947bb8b-dad6-41ce-9f54-836a10775855_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.031 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image b947bb8b-dad6-41ce-9f54-836a10775855_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.035 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "14c656cff84150942006df12a6d997e516fe4350" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.036 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "14c656cff84150942006df12a6d997e516fe4350" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.146 241570 DEBUG nova.network.neutron [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.147 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.235 241570 DEBUG nova.virt.libvirt.imagebackend [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Image locations are: [{'url': 'rbd://c21de27e-a7fd-594b-8324-0697ba9aab3a/images/d8f4b089-9930-48c5-890a-b63bbf40a7c4/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://c21de27e-a7fd-594b-8324-0697ba9aab3a/images/d8f4b089-9930-48c5-890a-b63bbf40a7c4/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Dec 03 21:32:47 compute-0 ceph-mon[75204]: pgmap v823: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 125 KiB/s rd, 8.0 KiB/s wr, 166 op/s
Dec 03 21:32:47 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1613450328' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.307 241570 DEBUG nova.virt.libvirt.imagebackend [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Selected location: {'url': 'rbd://c21de27e-a7fd-594b-8324-0697ba9aab3a/images/d8f4b089-9930-48c5-890a-b63bbf40a7c4/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.308 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] cloning images/d8f4b089-9930-48c5-890a-b63bbf40a7c4@snap to None/b947bb8b-dad6-41ce-9f54-836a10775855_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.430 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "14c656cff84150942006df12a6d997e516fe4350" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.598 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] resizing rbd image b947bb8b-dad6-41ce-9f54-836a10775855_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.677 241570 DEBUG nova.objects.instance [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lazy-loading 'migration_context' on Instance uuid b947bb8b-dad6-41ce-9f54-836a10775855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.696 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.696 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Ensure instance console log exists: /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.697 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.697 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.698 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.701 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='fa160df59a096a4d68dea663d126b5de',container_format='bare',created_at=2025-12-03T21:32:42Z,direct_url=<?>,disk_format='raw',id=d8f4b089-9930-48c5-890a-b63bbf40a7c4,min_disk=0,min_ram=0,name='tempest-image-dependency-test-1153271457',owner='11092597966341b0915e8c2a6530e568',properties=ImageMetaProps,protected=<?>,size=1024,status='active',tags=<?>,updated_at=2025-12-03T21:32:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'encryption_secret_uuid': None, 'encryption_format': None, 'guest_format': None, 'disk_bus': 'virtio', 'size': 0, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'device_type': 'disk', 'image_id': 'd8f4b089-9930-48c5-890a-b63bbf40a7c4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.706 241570 WARNING nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.711 241570 DEBUG nova.virt.libvirt.host [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.712 241570 DEBUG nova.virt.libvirt.host [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.716 241570 DEBUG nova.virt.libvirt.host [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.717 241570 DEBUG nova.virt.libvirt.host [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.717 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.718 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T21:30:47Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e4062fae-6b9c-487c-944b-c7d7fb777ccb',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='fa160df59a096a4d68dea663d126b5de',container_format='bare',created_at=2025-12-03T21:32:42Z,direct_url=<?>,disk_format='raw',id=d8f4b089-9930-48c5-890a-b63bbf40a7c4,min_disk=0,min_ram=0,name='tempest-image-dependency-test-1153271457',owner='11092597966341b0915e8c2a6530e568',properties=ImageMetaProps,protected=<?>,size=1024,status='active',tags=<?>,updated_at=2025-12-03T21:32:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.718 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.718 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.719 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.719 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.719 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.719 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.720 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.720 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.720 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.721 241570 DEBUG nova.virt.hardware [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 03 21:32:47 compute-0 nova_compute[241566]: 2025-12-03 21:32:47.724 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v824: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 3.7 KiB/s wr, 59 op/s
Dec 03 21:32:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 03 21:32:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2672130748' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.269 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.295 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image b947bb8b-dad6-41ce-9f54-836a10775855_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.300 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:48 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2672130748' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 03 21:32:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/826435445' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.842 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.542s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.844 241570 DEBUG nova.objects.instance [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lazy-loading 'pci_devices' on Instance uuid b947bb8b-dad6-41ce-9f54-836a10775855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.858 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] End _get_guest_xml xml=<domain type="kvm">
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <uuid>b947bb8b-dad6-41ce-9f54-836a10775855</uuid>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <name>instance-00000002</name>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <memory>131072</memory>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <vcpu>1</vcpu>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <metadata>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <nova:name>instance-depend-image</nova:name>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <nova:creationTime>2025-12-03 21:32:47</nova:creationTime>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <nova:flavor name="m1.nano">
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <nova:memory>128</nova:memory>
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <nova:disk>1</nova:disk>
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <nova:swap>0</nova:swap>
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <nova:ephemeral>0</nova:ephemeral>
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <nova:vcpus>1</nova:vcpus>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       </nova:flavor>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <nova:owner>
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <nova:user uuid="bc25c6732c60417d92846f1367ba9a4f">tempest-ImageDependencyTests-323442990-project-member</nova:user>
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <nova:project uuid="11092597966341b0915e8c2a6530e568">tempest-ImageDependencyTests-323442990</nova:project>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       </nova:owner>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <nova:root type="image" uuid="d8f4b089-9930-48c5-890a-b63bbf40a7c4"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <nova:ports/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </nova:instance>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   </metadata>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <sysinfo type="smbios">
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <system>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <entry name="manufacturer">RDO</entry>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <entry name="product">OpenStack Compute</entry>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <entry name="serial">b947bb8b-dad6-41ce-9f54-836a10775855</entry>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <entry name="uuid">b947bb8b-dad6-41ce-9f54-836a10775855</entry>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <entry name="family">Virtual Machine</entry>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </system>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   </sysinfo>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <os>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <boot dev="hd"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <smbios mode="sysinfo"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   </os>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <features>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <acpi/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <apic/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <vmcoreinfo/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   </features>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <clock offset="utc">
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <timer name="pit" tickpolicy="delay"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <timer name="hpet" present="no"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   </clock>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <cpu mode="host-model" match="exact">
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <topology sockets="1" cores="1" threads="1"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   </cpu>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   <devices>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <disk type="network" device="disk">
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <driver type="raw" cache="none"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <source protocol="rbd" name="vms/b947bb8b-dad6-41ce-9f54-836a10775855_disk">
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <host name="192.168.122.100" port="6789"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       </source>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <auth username="openstack">
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <secret type="ceph" uuid="c21de27e-a7fd-594b-8324-0697ba9aab3a"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       </auth>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <target dev="vda" bus="virtio"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <disk type="network" device="cdrom">
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <driver type="raw" cache="none"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <source protocol="rbd" name="vms/b947bb8b-dad6-41ce-9f54-836a10775855_disk.config">
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <host name="192.168.122.100" port="6789"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       </source>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <auth username="openstack">
Dec 03 21:32:48 compute-0 nova_compute[241566]:         <secret type="ceph" uuid="c21de27e-a7fd-594b-8324-0697ba9aab3a"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       </auth>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <target dev="sda" bus="sata"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </disk>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <serial type="pty">
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <log file="/var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/console.log" append="off"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </serial>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <video>
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <model type="virtio"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </video>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <input type="tablet" bus="usb"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <rng model="virtio">
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <backend model="random">/dev/urandom</backend>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </rng>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="pci" model="pcie-root-port"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <controller type="usb" index="0"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     <memballoon model="virtio">
Dec 03 21:32:48 compute-0 nova_compute[241566]:       <stats period="10"/>
Dec 03 21:32:48 compute-0 nova_compute[241566]:     </memballoon>
Dec 03 21:32:48 compute-0 nova_compute[241566]:   </devices>
Dec 03 21:32:48 compute-0 nova_compute[241566]: </domain>
Dec 03 21:32:48 compute-0 nova_compute[241566]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.915 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.916 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.916 241570 INFO nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Using config drive
Dec 03 21:32:48 compute-0 nova_compute[241566]: 2025-12-03 21:32:48.936 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image b947bb8b-dad6-41ce-9f54-836a10775855_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:32:48.937 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:32:48.937 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:32:48.937 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:49 compute-0 nova_compute[241566]: 2025-12-03 21:32:49.125 241570 INFO nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Creating config drive at /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/disk.config
Dec 03 21:32:49 compute-0 nova_compute[241566]: 2025-12-03 21:32:49.129 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6nkyul8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:49 compute-0 nova_compute[241566]: 2025-12-03 21:32:49.262 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6nkyul8" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:49 compute-0 nova_compute[241566]: 2025-12-03 21:32:49.301 241570 DEBUG nova.storage.rbd_utils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] rbd image b947bb8b-dad6-41ce-9f54-836a10775855_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Dec 03 21:32:49 compute-0 nova_compute[241566]: 2025-12-03 21:32:49.306 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/disk.config b947bb8b-dad6-41ce-9f54-836a10775855_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:32:49 compute-0 ceph-mon[75204]: pgmap v824: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 3.7 KiB/s wr, 59 op/s
Dec 03 21:32:49 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/826435445' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 03 21:32:49 compute-0 nova_compute[241566]: 2025-12-03 21:32:49.626 241570 DEBUG oslo_concurrency.processutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/disk.config b947bb8b-dad6-41ce-9f54-836a10775855_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:32:49 compute-0 nova_compute[241566]: 2025-12-03 21:32:49.628 241570 INFO nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Deleting local config drive /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855/disk.config because it was imported into RBD.
Dec 03 21:32:49 compute-0 systemd-machined[203931]: New machine qemu-2-instance-00000002.
Dec 03 21:32:49 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 03 21:32:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v825: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 3.2 KiB/s wr, 52 op/s
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.279 241570 DEBUG nova.virt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Emitting event <LifecycleEvent: 1764797570.2792146, b947bb8b-dad6-41ce-9f54-836a10775855 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.281 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] VM Resumed (Lifecycle Event)
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.287 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.287 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.291 241570 INFO nova.virt.libvirt.driver [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Instance spawned successfully.
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.291 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.305 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.311 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.313 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.314 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.314 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.315 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.315 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.315 241570 DEBUG nova.virt.libvirt.driver [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.339 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.340 241570 DEBUG nova.virt.driver [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] Emitting event <LifecycleEvent: 1764797570.2802093, b947bb8b-dad6-41ce-9f54-836a10775855 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.340 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] VM Started (Lifecycle Event)
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.367 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.369 241570 DEBUG nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.376 241570 INFO nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Took 3.44 seconds to spawn the instance on the hypervisor.
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.377 241570 DEBUG nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.399 241570 INFO nova.compute.manager [None req-3e600e4f-6e74-4ea2-8332-f744551ec151 - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.434 241570 INFO nova.compute.manager [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Took 4.58 seconds to build instance.
Dec 03 21:32:50 compute-0 nova_compute[241566]: 2025-12-03 21:32:50.461 241570 DEBUG oslo_concurrency.lockutils [None req-768b85f9-96b4-4f08-9f0a-22e2e26b4191 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "b947bb8b-dad6-41ce-9f54-836a10775855" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e100 do_prune osdmap full prune enabled
Dec 03 21:32:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e101 e101: 3 total, 3 up, 3 in
Dec 03 21:32:51 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e101: 3 total, 3 up, 3 in
Dec 03 21:32:51 compute-0 ceph-mon[75204]: pgmap v825: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 3.2 KiB/s wr, 52 op/s
Dec 03 21:32:51 compute-0 ceph-mon[75204]: osdmap e101: 3 total, 3 up, 3 in
Dec 03 21:32:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:32:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:32:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:32:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:32:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:32:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:32:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v827: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 21 KiB/s wr, 110 op/s
Dec 03 21:32:52 compute-0 nova_compute[241566]: 2025-12-03 21:32:52.682 241570 DEBUG nova.compute.manager [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:32:52 compute-0 nova_compute[241566]: 2025-12-03 21:32:52.733 241570 INFO nova.compute.manager [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] instance snapshotting
Dec 03 21:32:52 compute-0 nova_compute[241566]: 2025-12-03 21:32:52.967 241570 INFO nova.virt.libvirt.driver [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Beginning live snapshot process
Dec 03 21:32:53 compute-0 nova_compute[241566]: 2025-12-03 21:32:53.148 241570 DEBUG nova.storage.rbd_utils [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] creating snapshot(bb28ec91a50a49e38214c6d118aff05d) on rbd image(b947bb8b-dad6-41ce-9f54-836a10775855_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 03 21:32:53 compute-0 podman[246879]: 2025-12-03 21:32:53.153281002 +0000 UTC m=+0.099072742 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 03 21:32:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e101 do_prune osdmap full prune enabled
Dec 03 21:32:53 compute-0 ceph-mon[75204]: pgmap v827: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 21 KiB/s wr, 110 op/s
Dec 03 21:32:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e102 e102: 3 total, 3 up, 3 in
Dec 03 21:32:53 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e102: 3 total, 3 up, 3 in
Dec 03 21:32:53 compute-0 nova_compute[241566]: 2025-12-03 21:32:53.623 241570 DEBUG nova.storage.rbd_utils [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] cloning vms/b947bb8b-dad6-41ce-9f54-836a10775855_disk@bb28ec91a50a49e38214c6d118aff05d to images/af22589e-230e-4308-9c23-43fa3e67646b clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Dec 03 21:32:53 compute-0 nova_compute[241566]: 2025-12-03 21:32:53.730 241570 DEBUG nova.storage.rbd_utils [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] flattening images/af22589e-230e-4308-9c23-43fa3e67646b flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Dec 03 21:32:53 compute-0 nova_compute[241566]: 2025-12-03 21:32:53.856 241570 DEBUG nova.storage.rbd_utils [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] removing snapshot(bb28ec91a50a49e38214c6d118aff05d) on rbd image(b947bb8b-dad6-41ce-9f54-836a10775855_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Dec 03 21:32:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v829: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 20 KiB/s wr, 70 op/s
Dec 03 21:32:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e102 do_prune osdmap full prune enabled
Dec 03 21:32:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e103 e103: 3 total, 3 up, 3 in
Dec 03 21:32:54 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e103: 3 total, 3 up, 3 in
Dec 03 21:32:54 compute-0 ceph-mon[75204]: osdmap e102: 3 total, 3 up, 3 in
Dec 03 21:32:54 compute-0 nova_compute[241566]: 2025-12-03 21:32:54.597 241570 DEBUG nova.storage.rbd_utils [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] creating snapshot(snap) on rbd image(af22589e-230e-4308-9c23-43fa3e67646b) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Dec 03 21:32:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e103 do_prune osdmap full prune enabled
Dec 03 21:32:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e104 e104: 3 total, 3 up, 3 in
Dec 03 21:32:55 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e104: 3 total, 3 up, 3 in
Dec 03 21:32:55 compute-0 ceph-mon[75204]: pgmap v829: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 20 KiB/s wr, 70 op/s
Dec 03 21:32:55 compute-0 ceph-mon[75204]: osdmap e103: 3 total, 3 up, 3 in
Dec 03 21:32:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:32:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v832: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 5.8 KiB/s wr, 131 op/s
Dec 03 21:32:56 compute-0 ceph-mon[75204]: osdmap e104: 3 total, 3 up, 3 in
Dec 03 21:32:56 compute-0 nova_compute[241566]: 2025-12-03 21:32:56.966 241570 INFO nova.virt.libvirt.driver [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Snapshot image upload complete
Dec 03 21:32:56 compute-0 nova_compute[241566]: 2025-12-03 21:32:56.967 241570 INFO nova.compute.manager [None req-2fb1d622-ab36-4bf2-96ca-793271147ac1 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Took 4.23 seconds to snapshot the instance on the hypervisor.
Dec 03 21:32:57 compute-0 ceph-mon[75204]: pgmap v832: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 5.8 KiB/s wr, 131 op/s
Dec 03 21:32:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v833: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 4.8 KiB/s wr, 109 op/s
Dec 03 21:32:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e104 do_prune osdmap full prune enabled
Dec 03 21:32:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e105 e105: 3 total, 3 up, 3 in
Dec 03 21:32:58 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e105: 3 total, 3 up, 3 in
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.348 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "b947bb8b-dad6-41ce-9f54-836a10775855" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.349 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "b947bb8b-dad6-41ce-9f54-836a10775855" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.349 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "b947bb8b-dad6-41ce-9f54-836a10775855-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.349 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "b947bb8b-dad6-41ce-9f54-836a10775855-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.349 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "b947bb8b-dad6-41ce-9f54-836a10775855-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.350 241570 INFO nova.compute.manager [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Terminating instance
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.351 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "refresh_cache-b947bb8b-dad6-41ce-9f54-836a10775855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.351 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquired lock "refresh_cache-b947bb8b-dad6-41ce-9f54-836a10775855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.351 241570 DEBUG nova.network.neutron [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 03 21:32:59 compute-0 ceph-mon[75204]: pgmap v833: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 4.8 KiB/s wr, 109 op/s
Dec 03 21:32:59 compute-0 ceph-mon[75204]: osdmap e105: 3 total, 3 up, 3 in
Dec 03 21:32:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:32:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3117375342' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:32:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:32:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3117375342' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:32:59 compute-0 nova_compute[241566]: 2025-12-03 21:32:59.868 241570 DEBUG nova.network.neutron [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.138 241570 DEBUG nova.network.neutron [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.160 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Releasing lock "refresh_cache-b947bb8b-dad6-41ce-9f54-836a10775855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.161 241570 DEBUG nova.compute.manager [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 03 21:33:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v835: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 4.8 KiB/s wr, 109 op/s
Dec 03 21:33:00 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 03 21:33:00 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 1.136s CPU time.
Dec 03 21:33:00 compute-0 systemd-machined[203931]: Machine qemu-2-instance-00000002 terminated.
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.390 241570 INFO nova.virt.libvirt.driver [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Instance destroyed successfully.
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.391 241570 DEBUG nova.objects.instance [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lazy-loading 'resources' on Instance uuid b947bb8b-dad6-41ce-9f54-836a10775855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 03 21:33:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e105 do_prune osdmap full prune enabled
Dec 03 21:33:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3117375342' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:33:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3117375342' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:33:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e106 e106: 3 total, 3 up, 3 in
Dec 03 21:33:00 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e106: 3 total, 3 up, 3 in
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.847 241570 INFO nova.virt.libvirt.driver [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Deleting instance files /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855_del
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.848 241570 INFO nova.virt.libvirt.driver [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Deletion of /var/lib/nova/instances/b947bb8b-dad6-41ce-9f54-836a10775855_del complete
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.894 241570 DEBUG nova.virt.libvirt.host [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.895 241570 INFO nova.virt.libvirt.host [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] UEFI support detected
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.897 241570 INFO nova.compute.manager [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Took 0.74 seconds to destroy the instance on the hypervisor.
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.897 241570 DEBUG oslo.service.loopingcall [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.898 241570 DEBUG nova.compute.manager [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 03 21:33:00 compute-0 nova_compute[241566]: 2025-12-03 21:33:00.898 241570 DEBUG nova.network.neutron [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 03 21:33:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e106 do_prune osdmap full prune enabled
Dec 03 21:33:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e107 e107: 3 total, 3 up, 3 in
Dec 03 21:33:01 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e107: 3 total, 3 up, 3 in
Dec 03 21:33:01 compute-0 ceph-mon[75204]: pgmap v835: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 4.8 KiB/s wr, 109 op/s
Dec 03 21:33:01 compute-0 ceph-mon[75204]: osdmap e106: 3 total, 3 up, 3 in
Dec 03 21:33:01 compute-0 ceph-mon[75204]: osdmap e107: 3 total, 3 up, 3 in
Dec 03 21:33:01 compute-0 nova_compute[241566]: 2025-12-03 21:33:01.855 241570 DEBUG nova.network.neutron [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 03 21:33:01 compute-0 nova_compute[241566]: 2025-12-03 21:33:01.883 241570 DEBUG nova.network.neutron [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 03 21:33:01 compute-0 nova_compute[241566]: 2025-12-03 21:33:01.905 241570 INFO nova.compute.manager [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Took 1.01 seconds to deallocate network for instance.
Dec 03 21:33:01 compute-0 nova_compute[241566]: 2025-12-03 21:33:01.961 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:33:01 compute-0 nova_compute[241566]: 2025-12-03 21:33:01.962 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:33:02 compute-0 nova_compute[241566]: 2025-12-03 21:33:02.066 241570 DEBUG oslo_concurrency.processutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:33:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v838: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 4.7 KiB/s wr, 134 op/s
Dec 03 21:33:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:33:02 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/607795396' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:02 compute-0 nova_compute[241566]: 2025-12-03 21:33:02.640 241570 DEBUG oslo_concurrency.processutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.574s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:33:02 compute-0 nova_compute[241566]: 2025-12-03 21:33:02.645 241570 DEBUG nova.compute.provider_tree [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:33:02 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/607795396' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:02 compute-0 nova_compute[241566]: 2025-12-03 21:33:02.663 241570 DEBUG nova.scheduler.client.report [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:33:02 compute-0 nova_compute[241566]: 2025-12-03 21:33:02.693 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:02 compute-0 nova_compute[241566]: 2025-12-03 21:33:02.719 241570 INFO nova.scheduler.client.report [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Deleted allocations for instance b947bb8b-dad6-41ce-9f54-836a10775855
Dec 03 21:33:02 compute-0 nova_compute[241566]: 2025-12-03 21:33:02.782 241570 DEBUG oslo_concurrency.lockutils [None req-06daedfd-ae09-471c-98d4-00f7eb2090ac bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "b947bb8b-dad6-41ce-9f54-836a10775855" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.257 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "ab23bcbe-2091-4277-8f17-e9554b017c36" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.257 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "ab23bcbe-2091-4277-8f17-e9554b017c36" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.258 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "ab23bcbe-2091-4277-8f17-e9554b017c36-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.258 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "ab23bcbe-2091-4277-8f17-e9554b017c36-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.258 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "ab23bcbe-2091-4277-8f17-e9554b017c36-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.259 241570 INFO nova.compute.manager [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Terminating instance
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.260 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "refresh_cache-ab23bcbe-2091-4277-8f17-e9554b017c36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.260 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquired lock "refresh_cache-ab23bcbe-2091-4277-8f17-e9554b017c36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.260 241570 DEBUG nova.network.neutron [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 03 21:33:03 compute-0 ceph-mon[75204]: pgmap v838: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 4.7 KiB/s wr, 134 op/s
Dec 03 21:33:03 compute-0 nova_compute[241566]: 2025-12-03 21:33:03.857 241570 DEBUG nova.network.neutron [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.047 241570 DEBUG nova.network.neutron [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.069 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Releasing lock "refresh_cache-ab23bcbe-2091-4277-8f17-e9554b017c36" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.070 241570 DEBUG nova.compute.manager [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 03 21:33:04 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 03 21:33:04 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 1.313s CPU time.
Dec 03 21:33:04 compute-0 systemd-machined[203931]: Machine qemu-1-instance-00000001 terminated.
Dec 03 21:33:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v839: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 4.7 KiB/s wr, 134 op/s
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.296 241570 INFO nova.virt.libvirt.driver [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Instance destroyed successfully.
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.296 241570 DEBUG nova.objects.instance [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lazy-loading 'resources' on Instance uuid ab23bcbe-2091-4277-8f17-e9554b017c36 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.488 241570 INFO nova.virt.libvirt.driver [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Deleting instance files /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36_del
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.489 241570 INFO nova.virt.libvirt.driver [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Deletion of /var/lib/nova/instances/ab23bcbe-2091-4277-8f17-e9554b017c36_del complete
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.566 241570 INFO nova.compute.manager [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Took 0.50 seconds to destroy the instance on the hypervisor.
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.566 241570 DEBUG oslo.service.loopingcall [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.567 241570 DEBUG nova.compute.manager [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.567 241570 DEBUG nova.network.neutron [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.849 241570 DEBUG nova.network.neutron [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.862 241570 DEBUG nova.network.neutron [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 03 21:33:04 compute-0 nova_compute[241566]: 2025-12-03 21:33:04.871 241570 INFO nova.compute.manager [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Took 0.30 seconds to deallocate network for instance.
Dec 03 21:33:05 compute-0 nova_compute[241566]: 2025-12-03 21:33:05.122 241570 INFO nova.compute.manager [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Took 0.25 seconds to detach 1 volumes for instance.
Dec 03 21:33:05 compute-0 nova_compute[241566]: 2025-12-03 21:33:05.124 241570 DEBUG nova.compute.manager [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Deleting volume: 74f6cb4b-c1f6-4650-97bb-811b731c0960 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217
Dec 03 21:33:05 compute-0 nova_compute[241566]: 2025-12-03 21:33:05.579 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:33:05 compute-0 nova_compute[241566]: 2025-12-03 21:33:05.580 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:33:05 compute-0 nova_compute[241566]: 2025-12-03 21:33:05.644 241570 DEBUG oslo_concurrency.processutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:33:05 compute-0 ceph-mon[75204]: pgmap v839: 177 pgs: 177 active+clean; 42 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 102 KiB/s rd, 4.7 KiB/s wr, 134 op/s
Dec 03 21:33:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:33:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2290692199' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:06 compute-0 nova_compute[241566]: 2025-12-03 21:33:06.141 241570 DEBUG oslo_concurrency.processutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:33:06 compute-0 nova_compute[241566]: 2025-12-03 21:33:06.146 241570 DEBUG nova.compute.provider_tree [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:33:06 compute-0 nova_compute[241566]: 2025-12-03 21:33:06.158 241570 DEBUG nova.scheduler.client.report [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:33:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e107 do_prune osdmap full prune enabled
Dec 03 21:33:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e108 e108: 3 total, 3 up, 3 in
Dec 03 21:33:06 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e108: 3 total, 3 up, 3 in
Dec 03 21:33:06 compute-0 nova_compute[241566]: 2025-12-03 21:33:06.180 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v841: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 167 KiB/s rd, 7.3 KiB/s wr, 218 op/s
Dec 03 21:33:06 compute-0 nova_compute[241566]: 2025-12-03 21:33:06.206 241570 INFO nova.scheduler.client.report [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Deleted allocations for instance ab23bcbe-2091-4277-8f17-e9554b017c36
Dec 03 21:33:06 compute-0 nova_compute[241566]: 2025-12-03 21:33:06.259 241570 DEBUG oslo_concurrency.lockutils [None req-974e43e7-7aec-4d08-9363-359f6d9daf79 bc25c6732c60417d92846f1367ba9a4f 11092597966341b0915e8c2a6530e568 - - default default] Lock "ab23bcbe-2091-4277-8f17-e9554b017c36" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:33:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1178596790' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:33:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:33:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1178596790' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:33:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2290692199' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:06 compute-0 ceph-mon[75204]: osdmap e108: 3 total, 3 up, 3 in
Dec 03 21:33:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1178596790' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:33:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1178596790' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:33:07 compute-0 podman[247137]: 2025-12-03 21:33:07.152255978 +0000 UTC m=+0.083216456 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 03 21:33:07 compute-0 podman[247136]: 2025-12-03 21:33:07.159669777 +0000 UTC m=+0.093324228 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 03 21:33:07 compute-0 ceph-mon[75204]: pgmap v841: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 167 KiB/s rd, 7.3 KiB/s wr, 218 op/s
Dec 03 21:33:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v842: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 113 KiB/s rd, 4.4 KiB/s wr, 144 op/s
Dec 03 21:33:09 compute-0 ceph-mon[75204]: pgmap v842: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 113 KiB/s rd, 4.4 KiB/s wr, 144 op/s
Dec 03 21:33:09 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:33:09.936 151937 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '4a:b3:fa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:85:3a:67:f5:74'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 03 21:33:09 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:33:09.937 151937 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 03 21:33:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v843: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 43 KiB/s rd, 1.8 KiB/s wr, 55 op/s
Dec 03 21:33:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e108 do_prune osdmap full prune enabled
Dec 03 21:33:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e109 e109: 3 total, 3 up, 3 in
Dec 03 21:33:11 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e109: 3 total, 3 up, 3 in
Dec 03 21:33:11 compute-0 ceph-mon[75204]: pgmap v843: 177 pgs: 177 active+clean; 41 MiB data, 176 MiB used, 60 GiB / 60 GiB avail; 43 KiB/s rd, 1.8 KiB/s wr, 55 op/s
Dec 03 21:33:11 compute-0 ceph-mon[75204]: osdmap e109: 3 total, 3 up, 3 in
Dec 03 21:33:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v845: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 2.6 KiB/s wr, 72 op/s
Dec 03 21:33:13 compute-0 ceph-mon[75204]: pgmap v845: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 2.6 KiB/s wr, 72 op/s
Dec 03 21:33:13 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:33:13.940 151937 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f27c01e7-5b62-4209-a664-3ae50b74644d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 21:33:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v846: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 5.5 KiB/s rd, 638 B/s wr, 9 op/s
Dec 03 21:33:15 compute-0 nova_compute[241566]: 2025-12-03 21:33:15.386 241570 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764797580.3855567, b947bb8b-dad6-41ce-9f54-836a10775855 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 03 21:33:15 compute-0 nova_compute[241566]: 2025-12-03 21:33:15.387 241570 INFO nova.compute.manager [-] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] VM Stopped (Lifecycle Event)
Dec 03 21:33:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v847: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 4.3 KiB/s rd, 504 B/s wr, 7 op/s
Dec 03 21:33:16 compute-0 nova_compute[241566]: 2025-12-03 21:33:16.350 241570 DEBUG nova.compute.manager [None req-b0abebbc-23df-47c7-a914-957ac118e66d - - - - - -] [instance: b947bb8b-dad6-41ce-9f54-836a10775855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:33:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e109 do_prune osdmap full prune enabled
Dec 03 21:33:16 compute-0 ceph-mon[75204]: pgmap v846: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 5.5 KiB/s rd, 638 B/s wr, 9 op/s
Dec 03 21:33:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 e110: 3 total, 3 up, 3 in
Dec 03 21:33:16 compute-0 ceph-mon[75204]: log_channel(cluster) log [DBG] : osdmap e110: 3 total, 3 up, 3 in
Dec 03 21:33:17 compute-0 ceph-mon[75204]: pgmap v847: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 4.3 KiB/s rd, 504 B/s wr, 7 op/s
Dec 03 21:33:17 compute-0 ceph-mon[75204]: osdmap e110: 3 total, 3 up, 3 in
Dec 03 21:33:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v849: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 5.4 KiB/s rd, 628 B/s wr, 9 op/s
Dec 03 21:33:19 compute-0 nova_compute[241566]: 2025-12-03 21:33:19.295 241570 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764797584.293983, ab23bcbe-2091-4277-8f17-e9554b017c36 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 03 21:33:19 compute-0 nova_compute[241566]: 2025-12-03 21:33:19.296 241570 INFO nova.compute.manager [-] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] VM Stopped (Lifecycle Event)
Dec 03 21:33:19 compute-0 nova_compute[241566]: 2025-12-03 21:33:19.326 241570 DEBUG nova.compute.manager [None req-1cb47d76-34be-4bac-8a2c-e89955c35793 - - - - - -] [instance: ab23bcbe-2091-4277-8f17-e9554b017c36] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 03 21:33:19 compute-0 ceph-mon[75204]: pgmap v849: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail; 5.4 KiB/s rd, 628 B/s wr, 9 op/s
Dec 03 21:33:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v850: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:20 compute-0 nova_compute[241566]: 2025-12-03 21:33:20.488 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:20 compute-0 nova_compute[241566]: 2025-12-03 21:33:20.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:33:21
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['images', 'vms', 'cephfs.cephfs.data', 'backups', 'volumes', 'cephfs.cephfs.meta', '.mgr']
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:33:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:21 compute-0 ceph-mon[75204]: pgmap v850: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.551 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.573 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.573 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.574 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.574 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.575 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:21 compute-0 nova_compute[241566]: 2025-12-03 21:33:21.575 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:33:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:33:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v851: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:22 compute-0 nova_compute[241566]: 2025-12-03 21:33:22.553 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:22 compute-0 sudo[247176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:33:22 compute-0 sudo[247176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:22 compute-0 sudo[247176]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:22 compute-0 sudo[247201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:33:22 compute-0 sudo[247201]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:23 compute-0 ceph-mon[75204]: pgmap v851: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:23 compute-0 sudo[247201]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:33:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:33:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:33:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:33:23 compute-0 nova_compute[241566]: 2025-12-03 21:33:23.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:33:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:33:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:33:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:33:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:33:23 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:33:23 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:33:23 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:33:23 compute-0 nova_compute[241566]: 2025-12-03 21:33:23.582 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:33:23 compute-0 nova_compute[241566]: 2025-12-03 21:33:23.583 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:33:23 compute-0 nova_compute[241566]: 2025-12-03 21:33:23.583 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:23 compute-0 nova_compute[241566]: 2025-12-03 21:33:23.583 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:33:23 compute-0 nova_compute[241566]: 2025-12-03 21:33:23.584 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:33:23 compute-0 sudo[247257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:33:23 compute-0 sudo[247257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:23 compute-0 sudo[247257]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:23 compute-0 sudo[247284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:33:23 compute-0 sudo[247284]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:23 compute-0 podman[247282]: 2025-12-03 21:33:23.813369507 +0000 UTC m=+0.122275267 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec 03 21:33:24 compute-0 podman[247366]: 2025-12-03 21:33:24.083276898 +0000 UTC m=+0.060872797 container create 6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_galois, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:33:24 compute-0 systemd[1]: Started libpod-conmon-6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980.scope.
Dec 03 21:33:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:33:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3406652418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:24 compute-0 podman[247366]: 2025-12-03 21:33:24.051301149 +0000 UTC m=+0.028897108 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.163 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:33:24 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:33:24 compute-0 podman[247366]: 2025-12-03 21:33:24.192458091 +0000 UTC m=+0.170054050 container init 6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_galois, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:33:24 compute-0 podman[247366]: 2025-12-03 21:33:24.206450437 +0000 UTC m=+0.184046306 container start 6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_galois, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:33:24 compute-0 podman[247366]: 2025-12-03 21:33:24.210016693 +0000 UTC m=+0.187612592 container attach 6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_galois, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:33:24 compute-0 romantic_galois[247383]: 167 167
Dec 03 21:33:24 compute-0 systemd[1]: libpod-6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980.scope: Deactivated successfully.
Dec 03 21:33:24 compute-0 conmon[247383]: conmon 6d40f2bd1e994b59e60f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980.scope/container/memory.events
Dec 03 21:33:24 compute-0 podman[247366]: 2025-12-03 21:33:24.217954696 +0000 UTC m=+0.195550575 container died 6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_galois, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:33:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-0fdda0ee0e29b82c15256f60bc9bee099adeafd9a7a4237910160548a6ba1361-merged.mount: Deactivated successfully.
Dec 03 21:33:24 compute-0 podman[247366]: 2025-12-03 21:33:24.26948037 +0000 UTC m=+0.247076249 container remove 6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=romantic_galois, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:33:24 compute-0 systemd[1]: libpod-conmon-6d40f2bd1e994b59e60f82082ae0713f772e7df6d474c68eff29cab836d95980.scope: Deactivated successfully.
Dec 03 21:33:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v852: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.363 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.364 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5165MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.364 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.365 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:33:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:33:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:33:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:33:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:33:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:33:24 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:33:24 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3406652418' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.420 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.421 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.444 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:33:24 compute-0 podman[247410]: 2025-12-03 21:33:24.449426596 +0000 UTC m=+0.041444865 container create c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_pare, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:33:24 compute-0 systemd[1]: Started libpod-conmon-c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705.scope.
Dec 03 21:33:24 compute-0 podman[247410]: 2025-12-03 21:33:24.427855646 +0000 UTC m=+0.019873955 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:33:24 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:33:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb256fb5a38a2c9b9952de8683b236616556cf5709e3b98e4c854fff4787fc36/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb256fb5a38a2c9b9952de8683b236616556cf5709e3b98e4c854fff4787fc36/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb256fb5a38a2c9b9952de8683b236616556cf5709e3b98e4c854fff4787fc36/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb256fb5a38a2c9b9952de8683b236616556cf5709e3b98e4c854fff4787fc36/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb256fb5a38a2c9b9952de8683b236616556cf5709e3b98e4c854fff4787fc36/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:24 compute-0 podman[247410]: 2025-12-03 21:33:24.550773028 +0000 UTC m=+0.142791347 container init c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_pare, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:33:24 compute-0 podman[247410]: 2025-12-03 21:33:24.563701386 +0000 UTC m=+0.155719655 container start c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_pare, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:33:24 compute-0 podman[247410]: 2025-12-03 21:33:24.567865987 +0000 UTC m=+0.159884286 container attach c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_pare, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:33:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:33:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2227054090' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.963 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.971 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:33:24 compute-0 sleepy_pare[247429]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:33:24 compute-0 sleepy_pare[247429]: --> All data devices are unavailable
Dec 03 21:33:24 compute-0 nova_compute[241566]: 2025-12-03 21:33:24.987 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:33:25 compute-0 systemd[1]: libpod-c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705.scope: Deactivated successfully.
Dec 03 21:33:25 compute-0 podman[247410]: 2025-12-03 21:33:25.007126739 +0000 UTC m=+0.599145028 container died c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_pare, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:33:25 compute-0 nova_compute[241566]: 2025-12-03 21:33:25.012 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:33:25 compute-0 nova_compute[241566]: 2025-12-03 21:33:25.013 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-fb256fb5a38a2c9b9952de8683b236616556cf5709e3b98e4c854fff4787fc36-merged.mount: Deactivated successfully.
Dec 03 21:33:25 compute-0 podman[247410]: 2025-12-03 21:33:25.054381359 +0000 UTC m=+0.646399648 container remove c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_pare, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:33:25 compute-0 systemd[1]: libpod-conmon-c7cd500686839ef0d4460aaf32715067a764c66f27c0c0fb3b733d2e2f13d705.scope: Deactivated successfully.
Dec 03 21:33:25 compute-0 sudo[247284]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:25 compute-0 sudo[247484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:33:25 compute-0 sudo[247484]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:25 compute-0 sudo[247484]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:25 compute-0 sudo[247509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:33:25 compute-0 sudo[247509]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:25 compute-0 ceph-mon[75204]: pgmap v852: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:25 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2227054090' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:33:25 compute-0 podman[247546]: 2025-12-03 21:33:25.620710065 +0000 UTC m=+0.068400770 container create cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_keldysh, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:33:25 compute-0 systemd[1]: Started libpod-conmon-cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1.scope.
Dec 03 21:33:25 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:33:25 compute-0 podman[247546]: 2025-12-03 21:33:25.599232997 +0000 UTC m=+0.046923722 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:33:25 compute-0 podman[247546]: 2025-12-03 21:33:25.710953549 +0000 UTC m=+0.158644304 container init cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_keldysh, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:33:25 compute-0 podman[247546]: 2025-12-03 21:33:25.724278467 +0000 UTC m=+0.171969172 container start cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_keldysh, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:33:25 compute-0 podman[247546]: 2025-12-03 21:33:25.728159831 +0000 UTC m=+0.175850576 container attach cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_keldysh, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:33:25 compute-0 elated_keldysh[247562]: 167 167
Dec 03 21:33:25 compute-0 podman[247546]: 2025-12-03 21:33:25.731748527 +0000 UTC m=+0.179439272 container died cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_keldysh, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:33:25 compute-0 systemd[1]: libpod-cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1.scope: Deactivated successfully.
Dec 03 21:33:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-1bc2f77d000c211756d2810d0b5fa532ed97a0006e089ed97ab93352ca83e900-merged.mount: Deactivated successfully.
Dec 03 21:33:25 compute-0 podman[247546]: 2025-12-03 21:33:25.779052668 +0000 UTC m=+0.226743403 container remove cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=elated_keldysh, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:33:25 compute-0 systemd[1]: libpod-conmon-cf7b3fb4cb978ba73a83959408ae9245bf046e55cd6bbbc0933f2642e74f35d1.scope: Deactivated successfully.
Dec 03 21:33:26 compute-0 nova_compute[241566]: 2025-12-03 21:33:26.008 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:33:26 compute-0 podman[247586]: 2025-12-03 21:33:26.025748716 +0000 UTC m=+0.078614923 container create 1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_bouman, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:33:26 compute-0 systemd[1]: Started libpod-conmon-1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2.scope.
Dec 03 21:33:26 compute-0 podman[247586]: 2025-12-03 21:33:25.996291255 +0000 UTC m=+0.049157512 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:33:26 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:33:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a085c9bb03b19fdea05dd52b07174b47a64d4c74eeafcb5edd867d99b112b7d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a085c9bb03b19fdea05dd52b07174b47a64d4c74eeafcb5edd867d99b112b7d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a085c9bb03b19fdea05dd52b07174b47a64d4c74eeafcb5edd867d99b112b7d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a085c9bb03b19fdea05dd52b07174b47a64d4c74eeafcb5edd867d99b112b7d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:26 compute-0 podman[247586]: 2025-12-03 21:33:26.133914773 +0000 UTC m=+0.186781050 container init 1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_bouman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:33:26 compute-0 podman[247586]: 2025-12-03 21:33:26.145626247 +0000 UTC m=+0.198492414 container start 1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_bouman, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3)
Dec 03 21:33:26 compute-0 podman[247586]: 2025-12-03 21:33:26.14871866 +0000 UTC m=+0.201584837 container attach 1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_bouman, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:33:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v853: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:26 compute-0 frosty_bouman[247602]: {
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:     "0": [
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:         {
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "devices": [
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "/dev/loop3"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             ],
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_name": "ceph_lv0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_size": "21470642176",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "name": "ceph_lv0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "tags": {
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cluster_name": "ceph",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.crush_device_class": "",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.encrypted": "0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.objectstore": "bluestore",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osd_id": "0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.type": "block",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.vdo": "0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.with_tpm": "0"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             },
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "type": "block",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "vg_name": "ceph_vg0"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:         }
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:     ],
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:     "1": [
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:         {
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "devices": [
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "/dev/loop4"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             ],
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_name": "ceph_lv1",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_size": "21470642176",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "name": "ceph_lv1",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "tags": {
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cluster_name": "ceph",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.crush_device_class": "",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.encrypted": "0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.objectstore": "bluestore",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osd_id": "1",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.type": "block",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.vdo": "0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.with_tpm": "0"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             },
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "type": "block",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "vg_name": "ceph_vg1"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:         }
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:     ],
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:     "2": [
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:         {
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "devices": [
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "/dev/loop5"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             ],
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_name": "ceph_lv2",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_size": "21470642176",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "name": "ceph_lv2",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "tags": {
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.cluster_name": "ceph",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.crush_device_class": "",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.encrypted": "0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.objectstore": "bluestore",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osd_id": "2",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.type": "block",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.vdo": "0",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:                 "ceph.with_tpm": "0"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             },
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "type": "block",
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:             "vg_name": "ceph_vg2"
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:         }
Dec 03 21:33:26 compute-0 frosty_bouman[247602]:     ]
Dec 03 21:33:26 compute-0 frosty_bouman[247602]: }
Dec 03 21:33:26 compute-0 systemd[1]: libpod-1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2.scope: Deactivated successfully.
Dec 03 21:33:26 compute-0 podman[247586]: 2025-12-03 21:33:26.512306829 +0000 UTC m=+0.565173066 container died 1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_bouman, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:33:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-8a085c9bb03b19fdea05dd52b07174b47a64d4c74eeafcb5edd867d99b112b7d-merged.mount: Deactivated successfully.
Dec 03 21:33:26 compute-0 podman[247586]: 2025-12-03 21:33:26.571830998 +0000 UTC m=+0.624697205 container remove 1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=frosty_bouman, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:33:26 compute-0 systemd[1]: libpod-conmon-1045982a7db5002ffa6b211417a033e99f7ac4ea9120f967a516827ac61bb9a2.scope: Deactivated successfully.
Dec 03 21:33:26 compute-0 sudo[247509]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:26 compute-0 sudo[247622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:33:26 compute-0 sudo[247622]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:26 compute-0 sudo[247622]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:26 compute-0 sudo[247647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:33:26 compute-0 sudo[247647]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:27 compute-0 podman[247685]: 2025-12-03 21:33:27.113634325 +0000 UTC m=+0.061896834 container create 6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_rhodes, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:33:27 compute-0 systemd[1]: Started libpod-conmon-6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077.scope.
Dec 03 21:33:27 compute-0 podman[247685]: 2025-12-03 21:33:27.092309982 +0000 UTC m=+0.040572471 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:33:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:33:27 compute-0 podman[247685]: 2025-12-03 21:33:27.216391866 +0000 UTC m=+0.164654415 container init 6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_rhodes, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:33:27 compute-0 podman[247685]: 2025-12-03 21:33:27.224676728 +0000 UTC m=+0.172939237 container start 6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_rhodes, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:33:27 compute-0 sleepy_rhodes[247701]: 167 167
Dec 03 21:33:27 compute-0 podman[247685]: 2025-12-03 21:33:27.228665266 +0000 UTC m=+0.176927815 container attach 6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_rhodes, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle)
Dec 03 21:33:27 compute-0 systemd[1]: libpod-6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077.scope: Deactivated successfully.
Dec 03 21:33:27 compute-0 podman[247685]: 2025-12-03 21:33:27.229757035 +0000 UTC m=+0.178019564 container died 6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_rhodes, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:33:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-69f9a57bc1366843403f4948762dcbf12a4353d73daaa7487b37ba1d8e965499-merged.mount: Deactivated successfully.
Dec 03 21:33:27 compute-0 podman[247685]: 2025-12-03 21:33:27.266479461 +0000 UTC m=+0.214741940 container remove 6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sleepy_rhodes, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:33:27 compute-0 systemd[1]: libpod-conmon-6afe8d8cdd8d66d096998bed0356ce4ed976ba57f72e00ae3acf380c14d7d077.scope: Deactivated successfully.
Dec 03 21:33:27 compute-0 podman[247724]: 2025-12-03 21:33:27.444164246 +0000 UTC m=+0.038397373 container create 84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=trusting_bell, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:33:27 compute-0 systemd[1]: Started libpod-conmon-84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6.scope.
Dec 03 21:33:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:33:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fa77910200371e1cfb1c650b7c576686bc56e773e383bb14561e2490e9c7cb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fa77910200371e1cfb1c650b7c576686bc56e773e383bb14561e2490e9c7cb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fa77910200371e1cfb1c650b7c576686bc56e773e383bb14561e2490e9c7cb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01fa77910200371e1cfb1c650b7c576686bc56e773e383bb14561e2490e9c7cb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:33:27 compute-0 podman[247724]: 2025-12-03 21:33:27.424840907 +0000 UTC m=+0.019074054 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:33:27 compute-0 podman[247724]: 2025-12-03 21:33:27.541988224 +0000 UTC m=+0.136221371 container init 84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=trusting_bell, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 03 21:33:27 compute-0 podman[247724]: 2025-12-03 21:33:27.553667648 +0000 UTC m=+0.147900785 container start 84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=trusting_bell, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:33:27 compute-0 podman[247724]: 2025-12-03 21:33:27.557254814 +0000 UTC m=+0.151487961 container attach 84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=trusting_bell, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:33:27 compute-0 ceph-mon[75204]: pgmap v853: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:33:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:33:28 compute-0 lvm[247820]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:33:28 compute-0 lvm[247820]: VG ceph_vg1 finished
Dec 03 21:33:28 compute-0 lvm[247817]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:33:28 compute-0 lvm[247817]: VG ceph_vg0 finished
Dec 03 21:33:28 compute-0 lvm[247821]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:33:28 compute-0 lvm[247821]: VG ceph_vg2 finished
Dec 03 21:33:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v854: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:28 compute-0 trusting_bell[247740]: {}
Dec 03 21:33:28 compute-0 systemd[1]: libpod-84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6.scope: Deactivated successfully.
Dec 03 21:33:28 compute-0 systemd[1]: libpod-84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6.scope: Consumed 1.331s CPU time.
Dec 03 21:33:28 compute-0 podman[247724]: 2025-12-03 21:33:28.393369228 +0000 UTC m=+0.987602385 container died 84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=trusting_bell, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:33:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-01fa77910200371e1cfb1c650b7c576686bc56e773e383bb14561e2490e9c7cb-merged.mount: Deactivated successfully.
Dec 03 21:33:28 compute-0 podman[247724]: 2025-12-03 21:33:28.443896586 +0000 UTC m=+1.038129733 container remove 84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=trusting_bell, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:33:28 compute-0 systemd[1]: libpod-conmon-84b021d7f0df9e55a8269a3eb005c7c077902ea30d619aaf42740d8497f254b6.scope: Deactivated successfully.
Dec 03 21:33:28 compute-0 sudo[247647]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:33:28 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:33:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:33:28 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:33:28 compute-0 sudo[247834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:33:28 compute-0 sudo[247834]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:33:28 compute-0 sudo[247834]: pam_unix(sudo:session): session closed for user root
Dec 03 21:33:29 compute-0 ceph-mon[75204]: pgmap v854: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:33:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:33:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v855: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:31 compute-0 ceph-mon[75204]: pgmap v855: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v856: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:33 compute-0 ceph-mon[75204]: pgmap v856: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v857: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:35 compute-0 ceph-mon[75204]: pgmap v857: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v858: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:37 compute-0 ceph-mon[75204]: pgmap v858: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:38 compute-0 podman[247860]: 2025-12-03 21:33:38.166746069 +0000 UTC m=+0.086016272 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:33:38 compute-0 podman[247859]: 2025-12-03 21:33:38.169448231 +0000 UTC m=+0.089911816 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 03 21:33:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v859: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:39 compute-0 ceph-mon[75204]: pgmap v859: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v860: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:41 compute-0 ceph-mon[75204]: pgmap v860: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v861: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:43 compute-0 ceph-mon[75204]: pgmap v861: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v862: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:45 compute-0 ceph-mon[75204]: pgmap v862: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v863: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:47 compute-0 ceph-mon[75204]: pgmap v863: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v864: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:33:48.937 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:33:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:33:48.938 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:33:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:33:48.938 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:33:50 compute-0 ceph-mon[75204]: pgmap v864: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v865: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:33:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:33:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:33:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:33:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:33:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:33:52 compute-0 ceph-mon[75204]: pgmap v865: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v866: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:54 compute-0 ceph-mon[75204]: pgmap v866: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:54 compute-0 podman[247897]: 2025-12-03 21:33:54.230320303 +0000 UTC m=+0.167209444 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:33:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v867: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:56 compute-0 ceph-mon[75204]: pgmap v867: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #39. Immutable memtables: 0.
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.364797) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 39
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797636364917, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1989, "num_deletes": 256, "total_data_size": 2085540, "memory_usage": 2135448, "flush_reason": "Manual Compaction"}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #40: started
Dec 03 21:33:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v868: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797636378993, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 40, "file_size": 1461730, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15713, "largest_seqno": 17701, "table_properties": {"data_size": 1454049, "index_size": 4435, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2181, "raw_key_size": 17812, "raw_average_key_size": 20, "raw_value_size": 1437803, "raw_average_value_size": 1689, "num_data_blocks": 199, "num_entries": 851, "num_filter_entries": 851, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797488, "oldest_key_time": 1764797488, "file_creation_time": 1764797636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 40, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 14318 microseconds, and 8581 cpu microseconds.
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.379128) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #40: 1461730 bytes OK
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.379161) [db/memtable_list.cc:519] [default] Level-0 commit table #40 started
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.380734) [db/memtable_list.cc:722] [default] Level-0 commit table #40: memtable #1 done
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.380759) EVENT_LOG_v1 {"time_micros": 1764797636380753, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.380788) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 2077013, prev total WAL file size 2077013, number of live WAL files 2.
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000036.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.381951) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353032' seq:72057594037927935, type:22 .. '6D67727374617400373533' seq:0, type:0; will stop at (end)
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [40(1427KB)], [38(5681KB)]
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797636382040, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [40], "files_L6": [38], "score": -1, "input_data_size": 7279717, "oldest_snapshot_seqno": -1}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #41: 3927 keys, 5741631 bytes, temperature: kUnknown
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797636431003, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 41, "file_size": 5741631, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5713727, "index_size": 16978, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9861, "raw_key_size": 92302, "raw_average_key_size": 23, "raw_value_size": 5641511, "raw_average_value_size": 1436, "num_data_blocks": 731, "num_entries": 3927, "num_filter_entries": 3927, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797636, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.431434) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 5741631 bytes
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.433108) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 148.3 rd, 117.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 5.5 +0.0 blob) out(5.5 +0.0 blob), read-write-amplify(8.9) write-amplify(3.9) OK, records in: 4381, records dropped: 454 output_compression: NoCompression
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.433138) EVENT_LOG_v1 {"time_micros": 1764797636433123, "job": 18, "event": "compaction_finished", "compaction_time_micros": 49083, "compaction_time_cpu_micros": 29663, "output_level": 6, "num_output_files": 1, "total_output_size": 5741631, "num_input_records": 4381, "num_output_records": 3927, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000040.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797636433843, "job": 18, "event": "table_file_deletion", "file_number": 40}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797636435895, "job": 18, "event": "table_file_deletion", "file_number": 38}
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.381802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.435950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.435957) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.435960) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.435963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:33:56 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:33:56.435967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:33:57 compute-0 ceph-mon[75204]: pgmap v868: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v869: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:59 compute-0 ceph-mon[75204]: pgmap v869: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:33:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:33:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2237149326' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:33:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:33:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2237149326' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:34:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v870: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2237149326' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:34:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/2237149326' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:34:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:01 compute-0 ceph-mon[75204]: pgmap v870: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v871: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:03 compute-0 ceph-mon[75204]: pgmap v871: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v872: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:04 compute-0 sshd-session[247923]: Accepted publickey for zuul from 192.168.122.10 port 34784 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:34:04 compute-0 systemd-logind[787]: New session 51 of user zuul.
Dec 03 21:34:04 compute-0 systemd[1]: Started Session 51 of User zuul.
Dec 03 21:34:04 compute-0 sshd-session[247923]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:34:04 compute-0 sudo[247927]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 03 21:34:04 compute-0 sudo[247927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:34:05 compute-0 ceph-mon[75204]: pgmap v872: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v873: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:07 compute-0 ceph-mon[75204]: pgmap v873: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:07 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14696 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:08 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14698 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v874: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:08 compute-0 ceph-mon[75204]: from='client.14696 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:08 compute-0 ceph-mon[75204]: from='client.14698 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0)
Dec 03 21:34:08 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3661501632' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:34:09 compute-0 podman[248152]: 2025-12-03 21:34:09.14241228 +0000 UTC m=+0.078901992 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 03 21:34:09 compute-0 podman[248153]: 2025-12-03 21:34:09.149554001 +0000 UTC m=+0.077936395 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:34:09 compute-0 ceph-mon[75204]: pgmap v874: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:09 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3661501632' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:34:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v875: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:11 compute-0 ceph-mon[75204]: pgmap v875: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v876: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:13 compute-0 ceph-mon[75204]: pgmap v876: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v877: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:15 compute-0 ceph-mon[75204]: pgmap v877: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v878: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:18 compute-0 ceph-mon[75204]: pgmap v878: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v879: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:19 compute-0 nova_compute[241566]: 2025-12-03 21:34:19.567 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:20 compute-0 ceph-mon[75204]: pgmap v879: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:20 compute-0 ceph-osd[86059]: bluestore.MempoolThread fragmentation_score=0.000120 took=0.000018s
Dec 03 21:34:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v880: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:20 compute-0 ceph-osd[88129]: bluestore.MempoolThread fragmentation_score=0.000147 took=0.000030s
Dec 03 21:34:20 compute-0 ceph-osd[87094]: bluestore.MempoolThread fragmentation_score=0.000123 took=0.000012s
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:34:21
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'images', 'backups', 'volumes', 'vms', '.mgr', 'cephfs.cephfs.data']
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:34:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:21 compute-0 nova_compute[241566]: 2025-12-03 21:34:21.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:21 compute-0 nova_compute[241566]: 2025-12-03 21:34:21.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:34:21 compute-0 nova_compute[241566]: 2025-12-03 21:34:21.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:34:21 compute-0 nova_compute[241566]: 2025-12-03 21:34:21.647 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:34:21 compute-0 nova_compute[241566]: 2025-12-03 21:34:21.648 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:21 compute-0 nova_compute[241566]: 2025-12-03 21:34:21.648 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:34:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:34:22 compute-0 ceph-mon[75204]: pgmap v880: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v881: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:22 compute-0 nova_compute[241566]: 2025-12-03 21:34:22.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:22 compute-0 nova_compute[241566]: 2025-12-03 21:34:22.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:22 compute-0 nova_compute[241566]: 2025-12-03 21:34:22.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:34:23 compute-0 nova_compute[241566]: 2025-12-03 21:34:23.550 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:23 compute-0 nova_compute[241566]: 2025-12-03 21:34:23.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:23 compute-0 nova_compute[241566]: 2025-12-03 21:34:23.577 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:34:23 compute-0 nova_compute[241566]: 2025-12-03 21:34:23.577 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:34:23 compute-0 nova_compute[241566]: 2025-12-03 21:34:23.578 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:34:23 compute-0 nova_compute[241566]: 2025-12-03 21:34:23.578 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:34:23 compute-0 nova_compute[241566]: 2025-12-03 21:34:23.578 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:34:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:34:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/201796836' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:34:24 compute-0 ceph-mon[75204]: pgmap v881: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:24 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/201796836' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.154 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.575s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.357 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.358 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5075MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.359 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.359 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:34:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v882: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.445 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.446 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.468 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:34:24 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:34:24 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4091558153' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.968 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.975 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.993 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.996 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:34:24 compute-0 nova_compute[241566]: 2025-12-03 21:34:24.996 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:34:25 compute-0 podman[248259]: 2025-12-03 21:34:25.137438122 +0000 UTC m=+0.148302035 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 21:34:25 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4091558153' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:34:26 compute-0 nova_compute[241566]: 2025-12-03 21:34:25.998 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:34:26 compute-0 ceph-mon[75204]: pgmap v882: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v883: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:34:28 compute-0 ceph-mon[75204]: pgmap v883: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v884: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:28 compute-0 sudo[248311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:34:28 compute-0 sudo[248311]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:28 compute-0 sudo[248311]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:28 compute-0 sudo[248336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:34:28 compute-0 sudo[248336]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:29 compute-0 ovs-vsctl[248404]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 03 21:34:29 compute-0 sudo[248336]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:34:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:34:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #42. Immutable memtables: 0.
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.635330) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 42
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797669635402, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 512, "num_deletes": 251, "total_data_size": 328859, "memory_usage": 339368, "flush_reason": "Manual Compaction"}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #43: started
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797669640391, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 43, "file_size": 324122, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17702, "largest_seqno": 18213, "table_properties": {"data_size": 321332, "index_size": 826, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6647, "raw_average_key_size": 18, "raw_value_size": 315778, "raw_average_value_size": 892, "num_data_blocks": 38, "num_entries": 354, "num_filter_entries": 354, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797637, "oldest_key_time": 1764797637, "file_creation_time": 1764797669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 43, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 5093 microseconds, and 1827 cpu microseconds.
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.640432) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #43: 324122 bytes OK
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.640452) [db/memtable_list.cc:519] [default] Level-0 commit table #43 started
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.641691) [db/memtable_list.cc:722] [default] Level-0 commit table #43: memtable #1 done
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.641707) EVENT_LOG_v1 {"time_micros": 1764797669641702, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.641727) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 325912, prev total WAL file size 325912, number of live WAL files 2.
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000039.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.642291) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [43(316KB)], [41(5607KB)]
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797669642351, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [43], "files_L6": [41], "score": -1, "input_data_size": 6065753, "oldest_snapshot_seqno": -1}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:34:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:34:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:34:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:34:29 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:34:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:34:29 compute-0 sudo[248456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:34:29 compute-0 sudo[248456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:29 compute-0 sudo[248456]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #44: 3772 keys, 4891771 bytes, temperature: kUnknown
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797669741044, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 44, "file_size": 4891771, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 4866124, "index_size": 15106, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9477, "raw_key_size": 89730, "raw_average_key_size": 23, "raw_value_size": 4797795, "raw_average_value_size": 1271, "num_data_blocks": 644, "num_entries": 3772, "num_filter_entries": 3772, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.741598) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 4891771 bytes
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.746144) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 61.3 rd, 49.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 5.5 +0.0 blob) out(4.7 +0.0 blob), read-write-amplify(33.8) write-amplify(15.1) OK, records in: 4281, records dropped: 509 output_compression: NoCompression
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.746163) EVENT_LOG_v1 {"time_micros": 1764797669746152, "job": 20, "event": "compaction_finished", "compaction_time_micros": 98982, "compaction_time_cpu_micros": 12616, "output_level": 6, "num_output_files": 1, "total_output_size": 4891771, "num_input_records": 4281, "num_output_records": 3772, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000043.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797669746596, "job": 20, "event": "table_file_deletion", "file_number": 43}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797669748044, "job": 20, "event": "table_file_deletion", "file_number": 41}
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.642230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.748082) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.748088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.748090) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.748092) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:34:29 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:34:29.748094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:34:29 compute-0 ceph-mon[75204]: pgmap v884: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:34:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:34:29 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:34:29 compute-0 sudo[248481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:34:29 compute-0 sudo[248481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:30 compute-0 podman[248545]: 2025-12-03 21:34:30.13600977 +0000 UTC m=+0.066964680 container create c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_lewin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:34:30 compute-0 systemd[1]: Started libpod-conmon-c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe.scope.
Dec 03 21:34:30 compute-0 podman[248545]: 2025-12-03 21:34:30.10698383 +0000 UTC m=+0.037938790 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:34:30 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:34:30 compute-0 podman[248545]: 2025-12-03 21:34:30.232932764 +0000 UTC m=+0.163887714 container init c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_lewin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:34:30 compute-0 podman[248545]: 2025-12-03 21:34:30.241288658 +0000 UTC m=+0.172243528 container start c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_lewin, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:34:30 compute-0 podman[248545]: 2025-12-03 21:34:30.2454374 +0000 UTC m=+0.176392280 container attach c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_lewin, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:34:30 compute-0 cranky_lewin[248615]: 167 167
Dec 03 21:34:30 compute-0 virtqemud[241184]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 03 21:34:30 compute-0 systemd[1]: libpod-c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe.scope: Deactivated successfully.
Dec 03 21:34:30 compute-0 podman[248545]: 2025-12-03 21:34:30.2890074 +0000 UTC m=+0.219962290 container died c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_lewin, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:34:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-35717595093e0aa56daa12735f5fbc804dca7256a8282693b66e33e3d4c5901c-merged.mount: Deactivated successfully.
Dec 03 21:34:30 compute-0 virtqemud[241184]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 03 21:34:30 compute-0 podman[248545]: 2025-12-03 21:34:30.333803664 +0000 UTC m=+0.264758554 container remove c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cranky_lewin, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:34:30 compute-0 systemd[1]: libpod-conmon-c786748fb59db54b3aee60dee873e16a5cb66fdd6575ed46dd4ce1b33ae3e9fe.scope: Deactivated successfully.
Dec 03 21:34:30 compute-0 virtqemud[241184]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 21:34:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v885: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:30 compute-0 podman[248662]: 2025-12-03 21:34:30.579364082 +0000 UTC m=+0.066852248 container create a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_dubinsky, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:34:30 compute-0 systemd[1]: Started libpod-conmon-a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa.scope.
Dec 03 21:34:30 compute-0 podman[248662]: 2025-12-03 21:34:30.554673068 +0000 UTC m=+0.042161264 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:34:30 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:34:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac92f578ffd9f553ae45ec60f54c8a3c7f9e25e8ec2b0a65646bce4c645c3e9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac92f578ffd9f553ae45ec60f54c8a3c7f9e25e8ec2b0a65646bce4c645c3e9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac92f578ffd9f553ae45ec60f54c8a3c7f9e25e8ec2b0a65646bce4c645c3e9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac92f578ffd9f553ae45ec60f54c8a3c7f9e25e8ec2b0a65646bce4c645c3e9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ac92f578ffd9f553ae45ec60f54c8a3c7f9e25e8ec2b0a65646bce4c645c3e9/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:30 compute-0 podman[248662]: 2025-12-03 21:34:30.710205457 +0000 UTC m=+0.197693653 container init a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_dubinsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:34:30 compute-0 podman[248662]: 2025-12-03 21:34:30.722131937 +0000 UTC m=+0.209620133 container start a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_dubinsky, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:34:30 compute-0 podman[248662]: 2025-12-03 21:34:30.726491424 +0000 UTC m=+0.213979620 container attach a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_dubinsky, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:34:30 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: cache status {prefix=cache status} (starting...)
Dec 03 21:34:31 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: client ls {prefix=client ls} (starting...)
Dec 03 21:34:31 compute-0 pensive_dubinsky[248714]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:34:31 compute-0 pensive_dubinsky[248714]: --> All data devices are unavailable
Dec 03 21:34:31 compute-0 systemd[1]: libpod-a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa.scope: Deactivated successfully.
Dec 03 21:34:31 compute-0 podman[248662]: 2025-12-03 21:34:31.295506103 +0000 UTC m=+0.782994279 container died a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_dubinsky, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:34:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-1ac92f578ffd9f553ae45ec60f54c8a3c7f9e25e8ec2b0a65646bce4c645c3e9-merged.mount: Deactivated successfully.
Dec 03 21:34:31 compute-0 podman[248662]: 2025-12-03 21:34:31.34083958 +0000 UTC m=+0.828327736 container remove a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pensive_dubinsky, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS)
Dec 03 21:34:31 compute-0 systemd[1]: libpod-conmon-a351f01f75480eec117665002a5d2f6893f42de06caa923cf0f263ead9300afa.scope: Deactivated successfully.
Dec 03 21:34:31 compute-0 lvm[248920]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:34:31 compute-0 lvm[248920]: VG ceph_vg2 finished
Dec 03 21:34:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:31 compute-0 sudo[248481]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:31 compute-0 lvm[248932]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:34:31 compute-0 lvm[248932]: VG ceph_vg1 finished
Dec 03 21:34:31 compute-0 lvm[248947]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:34:31 compute-0 lvm[248947]: VG ceph_vg0 finished
Dec 03 21:34:31 compute-0 sudo[248937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:34:31 compute-0 sudo[248937]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:31 compute-0 sudo[248937]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:31 compute-0 sudo[248972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:34:31 compute-0 sudo[248972]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:31 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14706 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:31 compute-0 ceph-mon[75204]: pgmap v885: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:31 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: damage ls {prefix=damage ls} (starting...)
Dec 03 21:34:31 compute-0 podman[249032]: 2025-12-03 21:34:31.830984249 +0000 UTC m=+0.055738388 container create 8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_shannon, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:34:31 compute-0 systemd[1]: Started libpod-conmon-8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227.scope.
Dec 03 21:34:31 compute-0 podman[249032]: 2025-12-03 21:34:31.800988023 +0000 UTC m=+0.025742192 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:34:31 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:34:31 compute-0 podman[249032]: 2025-12-03 21:34:31.914126352 +0000 UTC m=+0.138880481 container init 8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_shannon, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:34:31 compute-0 podman[249032]: 2025-12-03 21:34:31.921595373 +0000 UTC m=+0.146349512 container start 8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_shannon, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:34:31 compute-0 podman[249032]: 2025-12-03 21:34:31.924549713 +0000 UTC m=+0.149303822 container attach 8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_shannon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:34:31 compute-0 affectionate_shannon[249076]: 167 167
Dec 03 21:34:31 compute-0 systemd[1]: libpod-8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227.scope: Deactivated successfully.
Dec 03 21:34:31 compute-0 podman[249032]: 2025-12-03 21:34:31.928662024 +0000 UTC m=+0.153416163 container died 8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_shannon, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:34:31 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump loads {prefix=dump loads} (starting...)
Dec 03 21:34:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-c294490d04a6e536c725b6265712c02c411952d1b786dce2ee6c87c06e7e59c6-merged.mount: Deactivated successfully.
Dec 03 21:34:31 compute-0 podman[249032]: 2025-12-03 21:34:31.966689765 +0000 UTC m=+0.191443914 container remove 8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=affectionate_shannon, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:34:31 compute-0 systemd[1]: libpod-conmon-8dc3c05de104816f63a71920d4372e1a8b4de4354b790278ca34b5c19869c227.scope: Deactivated successfully.
Dec 03 21:34:32 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 03 21:34:32 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14708 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:32 compute-0 podman[249125]: 2025-12-03 21:34:32.145268953 +0000 UTC m=+0.044247780 container create 3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_hofstadter, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 03 21:34:32 compute-0 systemd[1]: Started libpod-conmon-3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd.scope.
Dec 03 21:34:32 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:34:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94b466ca325c8b58e4a11da9187f605a6ecf99a8e073d78ae34ac0df365f9ebb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94b466ca325c8b58e4a11da9187f605a6ecf99a8e073d78ae34ac0df365f9ebb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94b466ca325c8b58e4a11da9187f605a6ecf99a8e073d78ae34ac0df365f9ebb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/94b466ca325c8b58e4a11da9187f605a6ecf99a8e073d78ae34ac0df365f9ebb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:32 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 03 21:34:32 compute-0 podman[249125]: 2025-12-03 21:34:32.213158437 +0000 UTC m=+0.112137274 container init 3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_hofstadter, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:34:32 compute-0 podman[249125]: 2025-12-03 21:34:32.221676906 +0000 UTC m=+0.120655753 container start 3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_hofstadter, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:34:32 compute-0 podman[249125]: 2025-12-03 21:34:32.224791939 +0000 UTC m=+0.123770766 container attach 3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_hofstadter, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:34:32 compute-0 podman[249125]: 2025-12-03 21:34:32.131423381 +0000 UTC m=+0.030402228 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:34:32 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 03 21:34:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v886: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:32 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]: {
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:     "0": [
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:         {
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "devices": [
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "/dev/loop3"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             ],
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_name": "ceph_lv0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_size": "21470642176",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "name": "ceph_lv0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "tags": {
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cluster_name": "ceph",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.crush_device_class": "",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.encrypted": "0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.objectstore": "bluestore",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osd_id": "0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.type": "block",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.vdo": "0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.with_tpm": "0"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             },
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "type": "block",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "vg_name": "ceph_vg0"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:         }
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:     ],
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:     "1": [
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:         {
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "devices": [
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "/dev/loop4"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             ],
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_name": "ceph_lv1",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_size": "21470642176",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "name": "ceph_lv1",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "tags": {
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cluster_name": "ceph",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.crush_device_class": "",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.encrypted": "0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.objectstore": "bluestore",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osd_id": "1",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.type": "block",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.vdo": "0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.with_tpm": "0"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             },
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "type": "block",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "vg_name": "ceph_vg1"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:         }
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:     ],
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:     "2": [
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:         {
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "devices": [
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "/dev/loop5"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             ],
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_name": "ceph_lv2",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_size": "21470642176",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "name": "ceph_lv2",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "tags": {
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.cluster_name": "ceph",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.crush_device_class": "",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.encrypted": "0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.objectstore": "bluestore",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osd_id": "2",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.type": "block",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.vdo": "0",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:                 "ceph.with_tpm": "0"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             },
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "type": "block",
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:             "vg_name": "ceph_vg2"
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:         }
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]:     ]
Dec 03 21:34:32 compute-0 cool_hofstadter[249150]: }
Dec 03 21:34:32 compute-0 systemd[1]: libpod-3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd.scope: Deactivated successfully.
Dec 03 21:34:32 compute-0 podman[249125]: 2025-12-03 21:34:32.530267676 +0000 UTC m=+0.429246503 container died 3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_hofstadter, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:34:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-94b466ca325c8b58e4a11da9187f605a6ecf99a8e073d78ae34ac0df365f9ebb-merged.mount: Deactivated successfully.
Dec 03 21:34:32 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14712 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "report"} v 0)
Dec 03 21:34:32 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1058758739' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 03 21:34:32 compute-0 podman[249125]: 2025-12-03 21:34:32.673365561 +0000 UTC m=+0.572344388 container remove 3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_hofstadter, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:34:32 compute-0 systemd[1]: libpod-conmon-3ab8d62106fcfd8e54573db20c20d964ab9ff604d4873f32a2199735cd9a9abd.scope: Deactivated successfully.
Dec 03 21:34:32 compute-0 sudo[248972]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:32 compute-0 ceph-mon[75204]: from='client.14706 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:32 compute-0 ceph-mon[75204]: from='client.14708 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:32 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1058758739' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 03 21:34:32 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 03 21:34:32 compute-0 sudo[249226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:34:32 compute-0 sudo[249226]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:32 compute-0 sudo[249226]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:32 compute-0 sudo[249290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:34:32 compute-0 sudo[249290]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:32 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 03 21:34:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:34:33 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3946531935' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:34:33 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14714 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:33 compute-0 ceph-mgr[75500]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:34:33 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: 2025-12-03T21:34:33.124+0000 7fa8c63a3640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:34:33 compute-0 podman[249336]: 2025-12-03 21:34:33.168275828 +0000 UTC m=+0.045012800 container create af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_almeida, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:34:33 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: ops {prefix=ops} (starting...)
Dec 03 21:34:33 compute-0 systemd[1]: Started libpod-conmon-af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a.scope.
Dec 03 21:34:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:34:33 compute-0 podman[249336]: 2025-12-03 21:34:33.23646369 +0000 UTC m=+0.113200702 container init af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_almeida, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030)
Dec 03 21:34:33 compute-0 podman[249336]: 2025-12-03 21:34:33.245462642 +0000 UTC m=+0.122199614 container start af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_almeida, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:34:33 compute-0 laughing_almeida[249361]: 167 167
Dec 03 21:34:33 compute-0 systemd[1]: libpod-af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a.scope: Deactivated successfully.
Dec 03 21:34:33 compute-0 podman[249336]: 2025-12-03 21:34:33.249784248 +0000 UTC m=+0.126521240 container attach af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_almeida, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:34:33 compute-0 podman[249336]: 2025-12-03 21:34:33.153932963 +0000 UTC m=+0.030669935 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:34:33 compute-0 podman[249336]: 2025-12-03 21:34:33.250113967 +0000 UTC m=+0.126850939 container died af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_almeida, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:34:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-47cb701e3e2ed7a43d276497449a8516bfefdb3e64476e42e85602c8c692f4e5-merged.mount: Deactivated successfully.
Dec 03 21:34:33 compute-0 podman[249336]: 2025-12-03 21:34:33.291327154 +0000 UTC m=+0.168064126 container remove af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=laughing_almeida, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:34:33 compute-0 systemd[1]: libpod-conmon-af869a01dbbcf8e94b3281499498c66ee35219dee089e90b6879398d8f478e4a.scope: Deactivated successfully.
Dec 03 21:34:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config log"} v 0)
Dec 03 21:34:33 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3377945510' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 03 21:34:33 compute-0 podman[249427]: 2025-12-03 21:34:33.542430581 +0000 UTC m=+0.075979793 container create 952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_greider, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:34:33 compute-0 systemd[1]: Started libpod-conmon-952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4.scope.
Dec 03 21:34:33 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:34:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f2b8e759a8684c8020bca8dbbfa8b5421cebe0c0b8a68e8c7bb70f742afed81/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:33 compute-0 podman[249427]: 2025-12-03 21:34:33.517714257 +0000 UTC m=+0.051263549 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:34:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f2b8e759a8684c8020bca8dbbfa8b5421cebe0c0b8a68e8c7bb70f742afed81/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f2b8e759a8684c8020bca8dbbfa8b5421cebe0c0b8a68e8c7bb70f742afed81/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f2b8e759a8684c8020bca8dbbfa8b5421cebe0c0b8a68e8c7bb70f742afed81/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:34:33 compute-0 podman[249427]: 2025-12-03 21:34:33.625099072 +0000 UTC m=+0.158648324 container init 952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_greider, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:34:33 compute-0 podman[249427]: 2025-12-03 21:34:33.630218629 +0000 UTC m=+0.163767831 container start 952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_greider, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:34:33 compute-0 podman[249427]: 2025-12-03 21:34:33.658027187 +0000 UTC m=+0.191576429 container attach 952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_greider, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:34:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 03 21:34:33 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4204876595' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 03 21:34:33 compute-0 ceph-mon[75204]: pgmap v886: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:33 compute-0 ceph-mon[75204]: from='client.14712 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:33 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3946531935' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:34:33 compute-0 ceph-mon[75204]: from='client.14714 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:33 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3377945510' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 03 21:34:33 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4204876595' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 03 21:34:33 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: session ls {prefix=session ls} (starting...)
Dec 03 21:34:33 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: status {prefix=status} (starting...)
Dec 03 21:34:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 03 21:34:34 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2722872596' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 03 21:34:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 03 21:34:34 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/377887505' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:34:34 compute-0 lvm[249591]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:34:34 compute-0 lvm[249593]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:34:34 compute-0 lvm[249593]: VG ceph_vg1 finished
Dec 03 21:34:34 compute-0 lvm[249591]: VG ceph_vg0 finished
Dec 03 21:34:34 compute-0 lvm[249595]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:34:34 compute-0 lvm[249595]: VG ceph_vg2 finished
Dec 03 21:34:34 compute-0 lvm[249620]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:34:34 compute-0 lvm[249620]: VG ceph_vg2 finished
Dec 03 21:34:34 compute-0 stupefied_greider[249449]: {}
Dec 03 21:34:34 compute-0 systemd[1]: libpod-952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4.scope: Deactivated successfully.
Dec 03 21:34:34 compute-0 systemd[1]: libpod-952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4.scope: Consumed 1.203s CPU time.
Dec 03 21:34:34 compute-0 podman[249427]: 2025-12-03 21:34:34.370640582 +0000 UTC m=+0.904189814 container died 952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_greider, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 03 21:34:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v887: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-0f2b8e759a8684c8020bca8dbbfa8b5421cebe0c0b8a68e8c7bb70f742afed81-merged.mount: Deactivated successfully.
Dec 03 21:34:34 compute-0 podman[249427]: 2025-12-03 21:34:34.413072072 +0000 UTC m=+0.946621274 container remove 952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_greider, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, ceph=True, OSD_FLAVOR=default)
Dec 03 21:34:34 compute-0 systemd[1]: libpod-conmon-952d7e715170c42b5bade2ce358d31e1acd104754b75d8637601d42c192f28b4.scope: Deactivated successfully.
Dec 03 21:34:34 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:34:34 compute-0 sudo[249290]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:34:34 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:34:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:34:34 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:34:34 compute-0 sudo[249659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:34:34 compute-0 sudo[249659]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:34:34 compute-0 sudo[249659]: pam_unix(sudo:session): session closed for user root
Dec 03 21:34:34 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14726 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:34 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 03 21:34:34 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3227831888' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:34:34 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2722872596' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 03 21:34:34 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/377887505' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:34:34 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:34:34 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:34:34 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3227831888' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:34:35 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14730 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 03 21:34:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/486783745' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "features"} v 0)
Dec 03 21:34:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3163567038' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 03 21:34:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3009857062' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: pgmap v887: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:35 compute-0 ceph-mon[75204]: from='client.14726 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: from='client.14730 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/486783745' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3163567038' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 03 21:34:35 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3009857062' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:34:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 03 21:34:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/107846304' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 03 21:34:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 03 21:34:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3980611023' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 03 21:34:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v888: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:36 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14742 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:36 compute-0 ceph-mgr[75500]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 03 21:34:36 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: 2025-12-03T21:34:36.674+0000 7fa8c63a3640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 03 21:34:36 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/107846304' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 03 21:34:36 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3980611023' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 03 21:34:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 03 21:34:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4143601676' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:34:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 03 21:34:37 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2866941577' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 03 21:34:37 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14748 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.810041 4 0.000108
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000020 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000063 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000024 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000056 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.810186 4 0.000042
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.810735 4 0.000043
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.809542 4 0.000026
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000015 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000018 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.809662 4 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000208 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000013 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000034 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000041 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000037 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000085 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000042 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000301 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000023 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000023 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.809864 4 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000069 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000109 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.810131 4 0.000026
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.809629 4 0.000031
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000075 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.810149 4 0.000085
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000014 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000120 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000016 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.809492 4 0.000028
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000062 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000016 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000014 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000018 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.809598 4 0.000064
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000019 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000020 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.809410 4 0.000028
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000071 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000029 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000050 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000098 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000046 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000118 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000055 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000021 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.000109 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=19/20 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002378 3 0.000239
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002410 3 0.000191
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.0( empty local-lis/les=37/38 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005498 3 0.000263
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005460 3 0.000103
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005483 3 0.000160
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005351 3 0.000106
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000053 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.10( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005468 3 0.000229
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005564 3 0.000154
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=19/19 les/c/f=20/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005525 3 0.000164
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005354 3 0.000140
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005378 3 0.000207
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005408 3 0.000184
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.17( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005259 3 0.000123
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005262 3 0.000414
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005374 3 0.000294
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.8( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.0( empty local-lis/les=37/38 n=0 ec=19/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005254 3 0.000055
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005212 3 0.000150
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.0( empty local-lis/les=37/38 n=0 ec=19/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.0( empty local-lis/les=37/38 n=0 ec=19/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.0( empty local-lis/les=37/38 n=0 ec=19/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005248 3 0.000148
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005120 3 0.000127
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005119 3 0.000121
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005014 3 0.000101
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004964 3 0.000146
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005005 3 0.000171
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004998 3 0.000420
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.6( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005003 3 0.000214
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004896 3 0.000151
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004770 3 0.000187
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000042 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004908 3 0.000190
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004842 3 0.000109
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004828 3 0.000196
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004752 3 0.000246
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004704 3 0.000434
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.1b( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 5)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:27.478458+0000 osd.2 (osd.2) 4 : cluster [DBG] 2.1d scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:27.489009+0000 osd.2 (osd.2) 5 : cluster [DBG] 2.1d scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.001439 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 38 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/19 les/c/f=38/20/0 sis=37) [2] r=0 lpr=37 pi=[19,37)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:09:59.193604+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59006976 unmapped: 1802240 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 314347 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 38 handle_osd_map epochs [38,39], i have 38, src has [1,39]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 39 heartbeat osd_stat(store_statfs(0x4fe168000/0x0/0x4ffc00000, data 0x26b89/0x62000, compress 0x0/0x0/0x0, omap 0x45ed, meta 0x1a2ba13), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:00.193848+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 7 sent 5 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:29.487879+0000 osd.2 (osd.2) 6 : cluster [DBG] 2.1e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:29.498500+0000 osd.2 (osd.2) 7 : cluster [DBG] 2.1e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59047936 unmapped: 1761280 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 7)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:29.487879+0000 osd.2 (osd.2) 6 : cluster [DBG] 2.1e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:29.498500+0000 osd.2 (osd.2) 7 : cluster [DBG] 2.1e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:01.194140+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59129856 unmapped: 1679360 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 39 handle_osd_map epochs [40,40], i have 39, src has [1,40]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:02.194288+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 9 sent 7 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:31.532086+0000 osd.2 (osd.2) 8 : cluster [DBG] 2.b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:31.542609+0000 osd.2 (osd.2) 9 : cluster [DBG] 2.b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 1622016 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 9)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:31.532086+0000 osd.2 (osd.2) 8 : cluster [DBG] 2.b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:31.542609+0000 osd.2 (osd.2) 9 : cluster [DBG] 2.b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:03.194528+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59187200 unmapped: 1622016 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 40 heartbeat osd_stat(store_statfs(0x4fe162000/0x0/0x4ffc00000, data 0x295f3/0x68000, compress 0x0/0x0/0x0, omap 0x4b03, meta 0x1a2b4fd), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:04.194776+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 1581056 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 324715 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:05.195030+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59228160 unmapped: 1581056 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:06.195334+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59219968 unmapped: 1589248 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:07.195506+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.410965919s of 14.511530876s, submitted: 209
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 59260928 unmapped: 1548288 heap: 60809216 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 40 heartbeat osd_stat(store_statfs(0x4fe162000/0x0/0x4ffc00000, data 0x295f3/0x68000, compress 0x0/0x0/0x0, omap 0x4b03, meta 0x1a2b4fd), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 40 handle_osd_map epochs [41,41], i have 40, src has [1,41]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000101 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000026
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000152 1 0.000041
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000079 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000026
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000107 1 0.000033
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000086 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000090 1 0.000056
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000075 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000018
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000053 1 0.000041
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000043 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000012
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000073 1 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000075 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000039
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000055 1 0.000034
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000052 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000015
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000073 1 0.000035
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000056 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000018 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000065 1 0.000033
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000071 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000048 1 0.000032
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=0/0 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.115552 8 0.000099
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.120882 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.121216 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.121254 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119262 8 0.000113
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.121750 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.121835 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884406090s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348602295s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.121877 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884387016s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348602295s@ mbc={}] exit Reset 0.000063 1 0.000104
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884387016s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348602295s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880822182s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.345054626s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884387016s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348602295s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884387016s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348602295s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884387016s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348602295s@ mbc={}] exit Start 0.000016 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.884387016s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348602295s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880803108s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.345054626s@ mbc={}] exit Reset 0.000054 1 0.000081
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880803108s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.345054626s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880803108s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.345054626s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880803108s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.345054626s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880803108s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.345054626s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880803108s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.345054626s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.146308 20 0.000142
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.152461 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.152568 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.152598 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853688240s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.318038940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853674889s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] exit Reset 0.000028 1 0.000049
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853674889s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853674889s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853674889s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853674889s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853674889s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.146129 20 0.000180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.152608 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.152722 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.152793 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853358269s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.318038940s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853232384s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] exit Reset 0.000201 1 0.000683
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853232384s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853232384s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853232384s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853232384s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] exit Start 0.000109 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.853232384s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.318038940s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.147707 20 0.000121
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.153889 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.153949 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.153976 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852234840s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317932129s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852218628s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317932129s@ mbc={}] exit Reset 0.000038 1 0.000066
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852218628s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317932129s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852218628s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317932129s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852218628s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317932129s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852218628s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317932129s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.852218628s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317932129s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.004026 2 0.000098
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000056 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000055 1 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.148126 20 0.000089
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.155164 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.155232 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.155257 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.851003647s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317916870s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850990295s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317916870s@ mbc={}] exit Reset 0.000026 1 0.000043
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850990295s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317916870s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850990295s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317916870s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850990295s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317916870s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850990295s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317916870s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850990295s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317916870s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.149078 20 0.000091
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.155274 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.155377 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.155407 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.118720 8 0.000051
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.124235 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.124365 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.124406 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881216049s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348243713s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881206512s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348243713s@ mbc={}] exit Reset 0.000021 1 0.000039
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881206512s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348243713s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881206512s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348243713s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881206512s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348243713s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881206512s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348243713s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850802422s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317840576s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.881206512s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348243713s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850778580s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317840576s@ mbc={}] exit Reset 0.000069 1 0.000111
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850778580s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317840576s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.149121 20 0.000290
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.155552 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850778580s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317840576s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.155607 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850778580s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317840576s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.155628 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850778580s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317840576s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850778580s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317840576s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 41 handle_osd_map epochs [41,41], i have 41, src has [1,41]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850716591s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317825317s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119094 8 0.000067
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.124591 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.124651 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.124677 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850179672s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] exit Reset 0.000552 1 0.000564
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850179672s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850179672s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850179672s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850179672s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850179672s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119291 8 0.000082
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.124905 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880301476s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348007202s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.124999 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.149840 20 0.000082
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.156047 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.156240 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125034 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.156287 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880270004s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348007202s@ mbc={}] exit Reset 0.000434 1 0.000469
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880270004s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348007202s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880270004s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348007202s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850170135s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317924500s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880270004s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348007202s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880270004s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348007202s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880525589s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348281860s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850155830s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317924500s@ mbc={}] exit Reset 0.000029 1 0.000051
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880270004s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348007202s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850155830s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317924500s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850155830s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317924500s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850155830s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317924500s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850155830s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317924500s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.850155830s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317924500s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880499840s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348281860s@ mbc={}] exit Reset 0.000054 1 0.000091
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880499840s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348281860s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880499840s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348281860s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880499840s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348281860s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880499840s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348281860s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880499840s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348281860s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119609 8 0.000094
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125026 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125081 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125103 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880378723s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348236084s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880367279s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348236084s@ mbc={}] exit Reset 0.000024 1 0.000043
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880367279s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348236084s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880367279s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348236084s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880367279s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348236084s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880367279s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348236084s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880367279s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348236084s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.150056 20 0.000106
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.156274 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.156365 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119423 8 0.000059
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.124898 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.124997 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.156416 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125020 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119496 8 0.000055
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125068 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880339622s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348297119s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125150 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125211 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849840164s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317825317s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849815369s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] exit Reset 0.000059 1 0.000094
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880279541s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348289490s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849815369s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849815369s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849815369s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849815369s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880253792s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348289490s@ mbc={}] exit Reset 0.000053 1 0.000088
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849815369s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317825317s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880253792s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348289490s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880253792s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348289490s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880253792s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348289490s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880253792s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348289490s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880223274s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348297119s@ mbc={}] exit Reset 0.000131 1 0.000145
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880253792s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348289490s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880223274s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348297119s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880223274s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348297119s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880223274s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348297119s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880223274s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348297119s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880223274s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348297119s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.150997 20 0.000129
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.156847 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.156928 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.156957 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849054337s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317298889s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119778 8 0.000076
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125187 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125270 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849026680s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317298889s@ mbc={}] exit Reset 0.000053 1 0.000113
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125302 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.151138 20 0.000072
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.156988 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849026680s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317298889s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.157140 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849026680s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317298889s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.157173 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849026680s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317298889s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849026680s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317298889s@ mbc={}] exit Start 0.000013 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.849026680s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317298889s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848933220s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317253113s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880013466s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348335266s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848918915s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317253113s@ mbc={}] exit Reset 0.000031 1 0.000056
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848918915s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317253113s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848918915s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317253113s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848918915s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317253113s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848918915s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317253113s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848918915s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317253113s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879985809s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348335266s@ mbc={}] exit Reset 0.000077 1 0.000092
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879985809s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348335266s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879985809s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348335266s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879985809s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348335266s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879985809s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348335266s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879985809s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348335266s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119862 8 0.000060
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125151 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125235 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125271 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880131721s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348648071s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.151401 20 0.000075
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.157373 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880120277s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348648071s@ mbc={}] exit Reset 0.000027 1 0.000046
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880120277s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348648071s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.157535 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880120277s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348648071s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880120277s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348648071s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.157567 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880120277s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348648071s@ mbc={}] exit Start 0.000017 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880120277s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348648071s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.151460 20 0.000078
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.157479 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.157590 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848572731s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317153931s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.157659 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119916 8 0.000062
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848547935s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317153931s@ mbc={}] exit Reset 0.000051 1 0.000088
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125071 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848547935s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317153931s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125141 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848547935s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317153931s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125165 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848423004s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317047119s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848547935s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317153931s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848547935s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317153931s@ mbc={}] exit Start 0.000013 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848547935s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317153931s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.880009651s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348655701s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879998207s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348655701s@ mbc={}] exit Reset 0.000030 1 0.000051
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879998207s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348655701s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879998207s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348655701s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879998207s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348655701s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879998207s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348655701s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879998207s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348655701s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848355293s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317047119s@ mbc={}] exit Reset 0.000093 1 0.000115
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848355293s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317047119s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.151656 20 0.000068
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848355293s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317047119s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.157792 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848355293s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317047119s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848355293s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317047119s@ mbc={}] exit Start 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119986 8 0.000072
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.157863 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125042 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848355293s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317047119s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125146 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125177 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.157903 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879922867s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348686218s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879914284s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] exit Reset 0.000027 1 0.000046
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879914284s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848220825s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317001343s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879914284s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879914284s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879914284s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879914284s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848196983s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317001343s@ mbc={}] exit Reset 0.000051 1 0.000092
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.151698 20 0.000161
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848196983s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317001343s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.157830 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848196983s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317001343s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.157980 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848196983s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317001343s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.120154 8 0.000054
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125306 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125383 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.158028 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125408 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848196983s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317001343s@ mbc={}] exit Start 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879804611s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348670959s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848196983s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317001343s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879792213s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348670959s@ mbc={}] exit Reset 0.000025 1 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848138809s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.317008972s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879792213s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348670959s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879792213s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348670959s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879792213s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348670959s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879792213s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348670959s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879792213s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348670959s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848105431s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317008972s@ mbc={}] exit Reset 0.000064 1 0.000096
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.151871 20 0.000076
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.158083 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.158180 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848105431s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317008972s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.158200 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848105431s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317008972s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848105431s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317008972s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.120118 8 0.000097
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848002434s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316963196s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848105431s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317008972s@ mbc={}] exit Start 0.000017 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125213 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.848105431s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.317008972s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847991943s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316963196s@ mbc={}] exit Reset 0.000026 1 0.000043
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125350 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847991943s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316963196s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847991943s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316963196s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847991943s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316963196s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847991943s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316963196s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125391 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847991943s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316963196s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879694939s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348716736s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879670143s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348716736s@ mbc={}] exit Reset 0.000052 1 0.000091
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879670143s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348716736s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879670143s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348716736s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879670143s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348716736s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879670143s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348716736s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879670143s) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348716736s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.120355 8 0.000076
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125417 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125470 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125494 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879558563s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348678589s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.152116 20 0.000097
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879547119s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348678589s@ mbc={}] exit Reset 0.000025 1 0.000043
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.158314 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879547119s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348678589s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879547119s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348678589s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879547119s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348678589s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.158419 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879547119s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348678589s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879547119s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348678589s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.158459 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.152080 20 0.000071
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.120461 8 0.000049
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125467 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.158444 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125561 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125591 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847544670s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316764832s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.158564 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.158602 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879440308s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348686218s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847513199s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316764832s@ mbc={}] exit Reset 0.000090 1 0.000132
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879426003s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] exit Reset 0.000032 1 0.000061
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847513199s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316764832s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879426003s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879426003s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847513199s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316764832s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879426003s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879426003s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847513199s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316764832s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879426003s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348686218s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847513199s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316764832s@ mbc={}] exit Start 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847468376s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316741943s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847513199s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316764832s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.152949 20 0.000124
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847443581s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316741943s@ mbc={}] exit Reset 0.000052 1 0.000131
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.158731 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.158794 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847443581s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316741943s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.158818 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847443581s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316741943s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847443581s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316741943s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847443581s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316741943s@ mbc={}] exit Start 0.000013 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847066879s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316398621s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847443581s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316741943s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847057343s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316398621s@ mbc={}] exit Reset 0.000034 1 0.000041
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847057343s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316398621s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847057343s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316398621s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847057343s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316398621s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847057343s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316398621s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847057343s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316398621s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.153036 20 0.000100
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.158868 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.158923 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.153120 20 0.000102
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.158876 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.159051 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.158955 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.159087 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.120597 8 0.000056
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125538 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846938133s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316390991s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125617 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846924782s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316390991s@ mbc={}] exit Reset 0.000026 1 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846894264s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316352844s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846924782s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316390991s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125650 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846924782s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316390991s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846924782s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316390991s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846924782s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316390991s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846924782s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316390991s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846867561s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316352844s@ mbc={}] exit Reset 0.000054 1 0.000091
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846867561s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316352844s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846867561s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316352844s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879432678s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348930359s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846867561s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316352844s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846867561s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316352844s@ mbc={}] exit Start 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846867561s) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316352844s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879405975s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348930359s@ mbc={}] exit Reset 0.000053 1 0.000086
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.153210 20 0.000129
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.159117 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.159233 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.159264 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879405975s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348930359s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879405975s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348930359s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879405975s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348930359s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879405975s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348930359s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879405975s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348930359s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.156262 20 0.000153
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.159238 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.159391 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.159423 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.120737 8 0.000053
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125609 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125730 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843698502s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.313354492s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125776 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843666077s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.313354492s@ mbc={}] exit Reset 0.000058 1 0.000090
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843666077s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.313354492s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879385948s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.349082947s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843666077s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.313354492s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843666077s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.313354492s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843666077s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.313354492s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.843666077s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.313354492s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846693993s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316413879s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879362106s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349082947s@ mbc={}] exit Reset 0.000047 1 0.000078
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879362106s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349082947s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879362106s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349082947s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846668243s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316413879s@ mbc={}] exit Reset 0.000195 1 0.000209
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879362106s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349082947s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846668243s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316413879s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846668243s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316413879s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879362106s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349082947s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846668243s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316413879s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846668243s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316413879s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879362106s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349082947s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.846668243s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316413879s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.120910 8 0.000049
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125786 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125849 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125869 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 13.152763 20 0.000132
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 13.158981 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879122734s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.348976135s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 13.159213 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879112244s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348976135s@ mbc={}] exit Reset 0.000029 1 0.000047
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879112244s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348976135s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879112244s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348976135s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879112244s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348976135s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] exit Started 13.159245 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879112244s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348976135s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879112244s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.348976135s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=33) [2] r=0 lpr=33 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847078323s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 active pruub 89.316993713s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.119542 8 0.001503
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.125771 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.125910 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] exit Started 9.125936 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=37) [2] r=0 lpr=37 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847055435s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316993713s@ mbc={}] exit Reset 0.000054 1 0.000087
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847055435s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316993713s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847055435s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316993713s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879151344s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 active pruub 93.349098206s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847055435s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316993713s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879140854s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349098206s@ mbc={}] exit Reset 0.000023 1 0.000039
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847055435s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316993713s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879140854s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349098206s@ mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879140854s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349098206s@ mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.847055435s) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY pruub 89.316993713s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879140854s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349098206s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879140854s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349098206s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.879140854s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY pruub 93.349098206s@ mbc={}] enter Started/Stray
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000039 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000008
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000062 1 0.000030
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008148 2 0.000036
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000083 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000019
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000019 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000100 1 0.000062
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000046 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000056 1 0.000031
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000029 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000036 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000029 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000060 1 0.000023
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000027 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000014
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000054 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000027 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000040 1 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000032 1 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000027 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000006
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000034 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000038 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000010
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000061 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000032 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000009
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000044 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000075 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000017
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000114 1 0.000036
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000051 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000035 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000008
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000030
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000065 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000056 1 0.000035
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000035 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000009
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000048 1 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000131 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000040
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000063 1 0.000041
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000051 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000010
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000052 1 0.000028
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000093 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000023
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000061 1 0.000037
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008435 2 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.001248 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019787 2 0.000053
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019538 2 0.000028
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019377 2 0.000044
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018921 2 0.000038
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018665 2 0.000033
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018447 2 0.000041
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019925 2 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.020758 2 0.000017
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021125 2 0.000018
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021272 2 0.000017
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000022 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.020637 2 0.000015
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021035 2 0.000018
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019937 2 0.000019
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019753 2 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019395 2 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.020828 2 0.000016
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018935 2 0.000031
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018672 2 0.000040
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018521 2 0.000019
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018175 2 0.000041
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.017610 2 0.000039
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021902 2 0.000047
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019298 2 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021733 2 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.022705 2 0.000038
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000025 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.018571 2 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:08.195756+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 11 sent 9 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:37.372729+0000 osd.2 (osd.2) 10 : cluster [DBG] 2.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:37.383371+0000 osd.2 (osd.2) 11 : cluster [DBG] 2.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 60792832 unmapped: 1064960 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 11)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:37.372729+0000 osd.2 (osd.2) 10 : cluster [DBG] 2.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:37.383371+0000 osd.2 (osd.2) 11 : cluster [DBG] 2.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 41 handle_osd_map epochs [41,42], i have 41, src has [1,42]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 41 handle_osd_map epochs [42,42], i have 42, src has [1,42]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.993374 2 0.000040
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.005419 2 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013315 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013728 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.009881 2 0.000039
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.014236 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.993517 2 0.000026
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013687 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.993717 2 0.000062
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 42 handle_osd_map epochs [42,42], i have 42, src has [1,42]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.992883 2 0.000083
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013489 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.994405 2 0.000032
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013179 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013965 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.994470 2 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013048 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984112 2 0.000933
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.007033 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.000753 2 0.001300
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.010552 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984730 2 0.000048
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006785 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984862 2 0.000053
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006700 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985435 2 0.000060
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006682 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985559 2 0.000255
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006948 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985789 2 0.000044
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006637 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985694 2 0.000166
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006832 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985804 2 0.000020
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006699 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986019 2 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006729 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986004 2 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005575 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986242 2 0.000020
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.006086 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986228 2 0.000026
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005620 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986447 2 0.000016
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005218 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986509 2 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005541 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986698 2 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005314 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986698 2 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.004983 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.994808 2 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986955 2 0.000018
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.016126 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.004664 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986443 2 0.000038
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.005117 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986270 2 0.000032
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.007492 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004449 4 0.000190
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007700 4 0.001052
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000014 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1b( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006785 4 0.000692
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.a( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007729 4 0.000958
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.18( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006802 4 0.000124
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.11( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006724 4 0.001055
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.e( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006708 4 0.000134
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1c( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006498 4 0.000091
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.18( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006553 4 0.000192
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006427 4 0.000067
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000016 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006307 4 0.000066
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.16( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007182 4 0.001376
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.11( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.1( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.018497 7 0.000036
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000086 1 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.018430 7 0.000036
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.018244 7 0.000085
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000102 1 0.000355
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000133 1 0.000397
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009990 4 0.000074
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010139 4 0.000131
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010294 4 0.000095
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.11( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.15( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009930 4 0.000153
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000024 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009880 4 0.000069
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.8( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009792 4 0.000068
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.5( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009734 4 0.000106
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009949 4 0.000128
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.2( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009596 4 0.000099
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.7( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009526 4 0.000080
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.c( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009241 4 0.000083
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1d( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008990 4 0.000068
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009526 4 0.000099
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.1a( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.8( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009010 4 0.002393
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008910 4 0.000089
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[4.13( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.1e( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008760 4 0.001266
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[3.5( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [2] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.009568 4 0.000120
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[7.e( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [2] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.020983 7 0.000055
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000126 1 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.023549 7 0.000042
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.025525 7 0.000371
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.023701 7 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.023006 7 0.000059
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.022872 7 0.000068
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.022320 7 0.000079
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.022954 7 0.000075
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.022637 7 0.000049
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.022391 7 0.000042
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000136 1 0.000040
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000185 1 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000243 1 0.000016
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000349 1 0.000014
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000414 1 0.000023
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000502 1 0.000016
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000599 1 0.000017
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000697 1 0.000017
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000795 1 0.000015
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.028145 7 0.000055
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000104 1 0.000087
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029354 7 0.000093
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029254 7 0.000072
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.028701 7 0.000061
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029388 7 0.000094
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000134 1 0.000046
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029333 7 0.000039
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029867 7 0.000064
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.028649 7 0.000042
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029255 7 0.000668
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029277 7 0.000044
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000307 1 0.000056
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.028829 7 0.000080
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029039 7 0.000099
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.034380 7 0.000065
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029279 7 0.000051
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000406 1 0.000044
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.028691 7 0.000038
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000355 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.012394 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.012500 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.13( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.031035 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000325 1 0.000023
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000427 1 0.000015
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000572 1 0.000016
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029921 7 0.000078
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029300 7 0.000072
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000704 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031004 7 0.000062
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032218 7 0.000096
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030653 7 0.000035
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029598 7 0.000069
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030761 7 0.000057
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030602 7 0.000038
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000962 1 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031560 7 0.000038
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031802 7 0.000065
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.034694 7 0.000062
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030741 7 0.000078
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.036090 7 0.000049
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001882 1 0.000033
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031310 7 0.000049
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.036399 7 0.000066
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002122 1 0.000019
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002177 1 0.000034
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002247 1 0.000017
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.002285 1 0.000036
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001743 1 0.000044
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001812 1 0.000065
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001775 1 0.000165
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001766 1 0.000028
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001844 1 0.000071
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001875 1 0.000981
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001210 1 0.001004
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001177 1 0.000044
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001257 1 0.000085
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001354 1 0.001077
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001413 1 0.000042
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001445 1 0.000018
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001500 1 0.000035
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001367 1 0.000313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.001385 1 0.000057
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.017415 1 0.000054
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.017598 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.14( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.036357 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.020853 1 0.000028
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.021033 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.15( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.039683 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.025814 1 0.000088
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.026023 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.11( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.047044 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 42 heartbeat osd_stat(store_statfs(0x4fe15f000/0x0/0x4ffc00000, data 0x2b2c7/0x6b000, compress 0x0/0x0/0x0, omap 0x4d8e, meta 0x1a2b272), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.031684 1 0.000030
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.031853 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.11( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.055439 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.038941 1 0.000042
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.039150 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1b( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.064914 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.046211 1 0.000047
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.046496 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.17( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.070219 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.053627 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.054019 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.15( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.077046 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.060944 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.061387 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.13( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.084293 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.068529 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.069073 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.9( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.091430 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.075563 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.076262 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.12( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.099262 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.082819 1 0.000020
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.083545 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.16( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.106206 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.090008 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.090833 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.d( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.113245 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.091147 1 0.000053
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.091303 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.b( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.119504 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.098093 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.098255 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.3( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.127688 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.105365 1 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.105705 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.a( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.134473 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.112625 1 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.113070 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.5( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.142357 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.119875 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.120258 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.4( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.149711 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.127445 1 0.000078
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.127831 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.157203 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.134823 1 0.000036
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.135443 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.19( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.164116 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.142319 1 0.000127
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.142810 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.7( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.172716 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.149099 1 0.000072
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.149875 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.f( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.179178 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.156347 1 0.000045
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.157353 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.6( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.186656 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.162863 1 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.164811 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1a( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.193686 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.169838 1 0.000038
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.172001 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.c( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.201102 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.177035 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.179243 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1d( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.213665 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.184399 1 0.000040
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.186675 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.9( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.215976 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.191819 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.194135 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.18( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.222857 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.199001 1 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.200777 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1f( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.230124 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.206388 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.208237 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.8( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.238224 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.213750 1 0.000046
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.215576 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.16( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.247918 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.221190 1 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.222994 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.3( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.253621 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.228321 1 0.000030
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.230214 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.2( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.261014 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.235660 1 0.000025
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.237579 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.5( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.268459 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.243151 1 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.245354 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.f( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.276401 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.250462 1 0.000026
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.251683 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.7( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.283531 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.257652 1 0.000057
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.258989 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.4( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.290595 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.265265 1 0.000029
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.266651 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1c( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.297327 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.272596 1 0.000024
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.274046 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.18( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.308779 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.279719 1 0.000022
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.281192 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.1d( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.311969 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.286963 1 0.000027
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.288495 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[2.19( empty lb MIN local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=-1 lpr=41 pi=[33,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.324616 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.294668 1 0.000018
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.296084 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.2( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.327706 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.301794 1 0.000021
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.303236 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 42 pg[5.1e( empty lb MIN local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=-1 lpr=41 pi=[37,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.339681 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:09.196042+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61120512 unmapped: 737280 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 306159 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 42 handle_osd_map epochs [42,43], i have 42, src has [1,43]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:10.196267+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61128704 unmapped: 729088 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:11.196427+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 13 sent 11 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:40.248504+0000 osd.2 (osd.2) 12 : cluster [DBG] 5.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:40.259049+0000 osd.2 (osd.2) 13 : cluster [DBG] 5.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61194240 unmapped: 663552 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 13)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:40.248504+0000 osd.2 (osd.2) 12 : cluster [DBG] 5.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:40.259049+0000 osd.2 (osd.2) 13 : cluster [DBG] 5.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 43 handle_osd_map epochs [44,44], i have 43, src has [1,44]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 44 heartbeat osd_stat(store_statfs(0x4fe155000/0x0/0x4ffc00000, data 0x2dd6d/0x71000, compress 0x0/0x0/0x0, omap 0x52a4, meta 0x1a2ad5c), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:12.196766+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 15 sent 13 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:41.252532+0000 osd.2 (osd.2) 14 : cluster [DBG] 2.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:41.263119+0000 osd.2 (osd.2) 15 : cluster [DBG] 2.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61218816 unmapped: 638976 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 15)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:41.252532+0000 osd.2 (osd.2) 14 : cluster [DBG] 2.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:41.263119+0000 osd.2 (osd.2) 15 : cluster [DBG] 2.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:13.197001+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 17 sent 15 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:42.297195+0000 osd.2 (osd.2) 16 : cluster [DBG] 5.1f scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:42.307798+0000 osd.2 (osd.2) 17 : cluster [DBG] 5.1f scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61227008 unmapped: 630784 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 17)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:42.297195+0000 osd.2 (osd.2) 16 : cluster [DBG] 5.1f scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:42.307798+0000 osd.2 (osd.2) 17 : cluster [DBG] 5.1f scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:14.197299+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 19 sent 17 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:43.251020+0000 osd.2 (osd.2) 18 : cluster [DBG] 5.10 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:43.261621+0000 osd.2 (osd.2) 19 : cluster [DBG] 5.10 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61267968 unmapped: 589824 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 320891 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 19)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:43.251020+0000 osd.2 (osd.2) 18 : cluster [DBG] 5.10 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:43.261621+0000 osd.2 (osd.2) 19 : cluster [DBG] 5.10 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:15.197548+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61267968 unmapped: 589824 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 44 heartbeat osd_stat(store_statfs(0x4fe156000/0x0/0x4ffc00000, data 0x2f1ed/0x74000, compress 0x0/0x0/0x0, omap 0x552f, meta 0x1a2aad1), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:16.197713+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 21 sent 19 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:45.197899+0000 osd.2 (osd.2) 20 : cluster [DBG] 2.14 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:45.208461+0000 osd.2 (osd.2) 21 : cluster [DBG] 2.14 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61276160 unmapped: 581632 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 21)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:45.197899+0000 osd.2 (osd.2) 20 : cluster [DBG] 2.14 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:45.208461+0000 osd.2 (osd.2) 21 : cluster [DBG] 2.14 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:17.197984+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61276160 unmapped: 581632 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.571439743s of 10.756100655s, submitted: 333
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:18.198180+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 23 sent 21 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:48.128880+0000 osd.2 (osd.2) 22 : cluster [DBG] 2.12 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:48.139450+0000 osd.2 (osd.2) 23 : cluster [DBG] 2.12 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61341696 unmapped: 516096 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 23)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:48.128880+0000 osd.2 (osd.2) 22 : cluster [DBG] 2.12 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:48.139450+0000 osd.2 (osd.2) 23 : cluster [DBG] 2.12 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:19.198414+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 44 handle_osd_map epochs [45,46], i have 44, src has [1,46]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61399040 unmapped: 458752 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 331693 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:20.198588+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 25 sent 23 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:50.113817+0000 osd.2 (osd.2) 24 : cluster [DBG] 2.10 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:50.124381+0000 osd.2 (osd.2) 25 : cluster [DBG] 2.10 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61431808 unmapped: 425984 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 25)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:50.113817+0000 osd.2 (osd.2) 24 : cluster [DBG] 2.10 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:50.124381+0000 osd.2 (osd.2) 25 : cluster [DBG] 2.10 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:21.198786+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 27 sent 25 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:51.087939+0000 osd.2 (osd.2) 26 : cluster [DBG] 5.17 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:51.098507+0000 osd.2 (osd.2) 27 : cluster [DBG] 5.17 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61456384 unmapped: 401408 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 46 heartbeat osd_stat(store_statfs(0x4fe152000/0x0/0x4ffc00000, data 0x31c83/0x7a000, compress 0x0/0x0/0x0, omap 0x57ba, meta 0x1a2a846), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 46 handle_osd_map epochs [47,47], i have 46, src has [1,47]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 27)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:51.087939+0000 osd.2 (osd.2) 26 : cluster [DBG] 5.17 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:51.098507+0000 osd.2 (osd.2) 27 : cluster [DBG] 5.17 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:22.198988+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 29 sent 27 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:52.064043+0000 osd.2 (osd.2) 28 : cluster [DBG] 5.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:52.074533+0000 osd.2 (osd.2) 29 : cluster [DBG] 5.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61472768 unmapped: 385024 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 29)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:52.064043+0000 osd.2 (osd.2) 28 : cluster [DBG] 5.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:52.074533+0000 osd.2 (osd.2) 29 : cluster [DBG] 5.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 47 handle_osd_map epochs [48,49], i have 47, src has [1,49]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:23.199199+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61489152 unmapped: 368640 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:24.199360+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61489152 unmapped: 368640 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 347710 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:25.199539+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 31 sent 29 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:55.085517+0000 osd.2 (osd.2) 30 : cluster [DBG] 2.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:10:55.096085+0000 osd.2 (osd.2) 31 : cluster [DBG] 2.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61505536 unmapped: 352256 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 31)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:55.085517+0000 osd.2 (osd.2) 30 : cluster [DBG] 2.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:10:55.096085+0000 osd.2 (osd.2) 31 : cluster [DBG] 2.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:26.199785+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61538304 unmapped: 319488 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:27.199926+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 49 heartbeat osd_stat(store_statfs(0x4fe147000/0x0/0x4ffc00000, data 0x35d37/0x83000, compress 0x0/0x0/0x0, omap 0x5cd0, meta 0x1a2a330), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 49 handle_osd_map epochs [50,51], i have 49, src has [1,51]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61554688 unmapped: 303104 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 51 handle_osd_map epochs [51,52], i have 51, src has [1,52]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:28.200080+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61456384 unmapped: 401408 heap: 61857792 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=0 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000111 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=0 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000036
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000015 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000172 1 0.000065
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.001206398s of 11.043487549s, submitted: 15
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000506 2 0.000091
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 52 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:29.200198+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61489152 unmapped: 1417216 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 358231 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 52 handle_osd_map epochs [52,53], i have 52, src has [1,53]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.734591 2 0.000068
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.735357 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=37/21 lis/c=52/37 les/c/f=53/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002088 3 0.000261
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=37/21 lis/c=52/37 les/c/f=53/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=37/21 lis/c=52/37 les/c/f=53/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=52/53 n=1 ec=37/21 lis/c=52/37 les/c/f=53/38/0 sis=52) [2] r=0 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 53 handle_osd_map epochs [52,53], i have 53, src has [1,53]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:30.200341+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 53 handle_osd_map epochs [53,53], i have 53, src has [1,53]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61505536 unmapped: 1400832 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:31.200492+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61497344 unmapped: 1409024 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:32.200723+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61505536 unmapped: 1400832 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 53 handle_osd_map epochs [54,55], i have 53, src has [1,55]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:33.200963+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 55 heartbeat osd_stat(store_statfs(0x4fe133000/0x0/0x4ffc00000, data 0x3e035/0x95000, compress 0x0/0x0/0x0, omap 0x66fc, meta 0x1a29904), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61521920 unmapped: 1384448 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 55 handle_osd_map epochs [55,56], i have 55, src has [1,56]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:34.201094+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 33 sent 31 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:04.164293+0000 osd.2 (osd.2) 32 : cluster [DBG] 5.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:04.174858+0000 osd.2 (osd.2) 33 : cluster [DBG] 5.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61530112 unmapped: 1376256 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 373346 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 56 handle_osd_map epochs [56,57], i have 56, src has [1,57]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 33)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:04.164293+0000 osd.2 (osd.2) 32 : cluster [DBG] 5.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:04.174858+0000 osd.2 (osd.2) 33 : cluster [DBG] 5.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:35.201367+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 35 sent 33 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:05.116119+0000 osd.2 (osd.2) 34 : cluster [DBG] 5.b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:05.126638+0000 osd.2 (osd.2) 35 : cluster [DBG] 5.b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 61571072 unmapped: 1335296 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 35)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:05.116119+0000 osd.2 (osd.2) 34 : cluster [DBG] 5.b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:05.126638+0000 osd.2 (osd.2) 35 : cluster [DBG] 5.b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:36.201594+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 37 sent 35 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:06.129452+0000 osd.2 (osd.2) 36 : cluster [DBG] 2.c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:06.139954+0000 osd.2 (osd.2) 37 : cluster [DBG] 2.c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 57 heartbeat osd_stat(store_statfs(0x4fe12f000/0x0/0x4ffc00000, data 0x40acb/0x9b000, compress 0x0/0x0/0x0, omap 0x6c12, meta 0x1a293ee), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 57 handle_osd_map epochs [58,58], i have 57, src has [1,58]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 57 handle_osd_map epochs [58,58], i have 58, src has [1,58]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62611456 unmapped: 294912 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 37)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:06.129452+0000 osd.2 (osd.2) 36 : cluster [DBG] 2.c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:06.139954+0000 osd.2 (osd.2) 37 : cluster [DBG] 2.c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:37.201791+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62611456 unmapped: 294912 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 58 handle_osd_map epochs [58,59], i have 58, src has [1,59]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:38.201978+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62619648 unmapped: 286720 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.0 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.0 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:39.202253+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 39 sent 37 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:09.116007+0000 osd.2 (osd.2) 38 : cluster [DBG] 2.0 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:09.126609+0000 osd.2 (osd.2) 39 : cluster [DBG] 2.0 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62627840 unmapped: 278528 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 387999 data_alloc: 218103808 data_used: 0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 59 heartbeat osd_stat(store_statfs(0x4fe125000/0x0/0x4ffc00000, data 0x436f7/0xa1000, compress 0x0/0x0/0x0, omap 0x7128, meta 0x1a28ed8), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 39)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:09.116007+0000 osd.2 (osd.2) 38 : cluster [DBG] 2.0 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:09.126609+0000 osd.2 (osd.2) 39 : cluster [DBG] 2.0 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.855418205s of 10.907942772s, submitted: 19
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:40.202547+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 41 sent 39 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:10.080385+0000 osd.2 (osd.2) 40 : cluster [DBG] 5.0 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:10.090942+0000 osd.2 (osd.2) 41 : cluster [DBG] 5.0 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62636032 unmapped: 270336 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 41)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:10.080385+0000 osd.2 (osd.2) 40 : cluster [DBG] 5.0 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:10.090942+0000 osd.2 (osd.2) 41 : cluster [DBG] 5.0 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:41.202759+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 43 sent 41 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:11.040926+0000 osd.2 (osd.2) 42 : cluster [DBG] 2.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:11.051468+0000 osd.2 (osd.2) 43 : cluster [DBG] 2.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62676992 unmapped: 229376 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:42.203036+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 43)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:11.040926+0000 osd.2 (osd.2) 42 : cluster [DBG] 2.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:11.051468+0000 osd.2 (osd.2) 43 : cluster [DBG] 2.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62685184 unmapped: 221184 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:43.203222+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 60 handle_osd_map epochs [61,61], i have 60, src has [1,61]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f(unlocked)] enter Initial
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=0 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000118 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=0 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000034 1 0.000063
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000097 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000324 1 0.000244
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 61 handle_osd_map epochs [60,61], i have 61, src has [1,61]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62693376 unmapped: 212992 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/GetLog 0.000613 2 0.000083
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/GetMissing 0.000009 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 61 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:44.203366+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 61 handle_osd_map epochs [61,62], i have 61, src has [1,62]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 61 handle_osd_map epochs [61,62], i have 62, src has [1,62]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.761318 2 0.000212
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering 0.762372 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 0'0 unknown m=3 mbc={}] enter Started/Primary/Active
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/Activating
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/Activating 0.003264 3 0.000203
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000103 1 0.000101
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000005 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 handle_osd_map epochs [62,62], i have 62, src has [1,62]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.126923 3 0.000051
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000025 0 0.000000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=61/62 n=1 ec=37/21 lis/c=61/45 les/c/f=62/46/0 sis=61) [2] r=0 lpr=61 pi=[45,61)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62750720 unmapped: 155648 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 407607 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe11e000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:45.203534+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 45 sent 43 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:14.967980+0000 osd.2 (osd.2) 44 : cluster [DBG] 5.6 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:14.978552+0000 osd.2 (osd.2) 45 : cluster [DBG] 5.6 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe11e000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62767104 unmapped: 139264 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 45)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:14.967980+0000 osd.2 (osd.2) 44 : cluster [DBG] 5.6 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:14.978552+0000 osd.2 (osd.2) 45 : cluster [DBG] 5.6 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:46.203840+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62775296 unmapped: 131072 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:47.203987+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 47 sent 45 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:16.966311+0000 osd.2 (osd.2) 46 : cluster [DBG] 5.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:16.976901+0000 osd.2 (osd.2) 47 : cluster [DBG] 5.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe11e000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62668800 unmapped: 237568 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 47)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:16.966311+0000 osd.2 (osd.2) 46 : cluster [DBG] 5.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:16.976901+0000 osd.2 (osd.2) 47 : cluster [DBG] 5.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:48.204230+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62668800 unmapped: 237568 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:49.204401+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62668800 unmapped: 237568 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 411421 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:50.204528+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 49 sent 47 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:20.016807+0000 osd.2 (osd.2) 48 : cluster [DBG] 5.d scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:20.027547+0000 osd.2 (osd.2) 49 : cluster [DBG] 5.d scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62685184 unmapped: 221184 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.859597206s of 10.914040565s, submitted: 21
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 49)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:20.016807+0000 osd.2 (osd.2) 48 : cluster [DBG] 5.d scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:20.027547+0000 osd.2 (osd.2) 49 : cluster [DBG] 5.d scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:51.204800+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 51 sent 49 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:20.994452+0000 osd.2 (osd.2) 50 : cluster [DBG] 5.1b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:21.004969+0000 osd.2 (osd.2) 51 : cluster [DBG] 5.1b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62685184 unmapped: 221184 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 51)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:20.994452+0000 osd.2 (osd.2) 50 : cluster [DBG] 5.1b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:21.004969+0000 osd.2 (osd.2) 51 : cluster [DBG] 5.1b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:52.205808+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62701568 unmapped: 204800 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:53.206041+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62701568 unmapped: 204800 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:54.206165+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62701568 unmapped: 204800 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 416245 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:55.206359+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62709760 unmapped: 196608 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:56.206664+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:26.121981+0000 osd.2 (osd.2) 52 : cluster [DBG] 4.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:26.132442+0000 osd.2 (osd.2) 53 : cluster [DBG] 4.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62734336 unmapped: 172032 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 53)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:26.121981+0000 osd.2 (osd.2) 52 : cluster [DBG] 4.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:26.132442+0000 osd.2 (osd.2) 53 : cluster [DBG] 4.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:57.206914+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62742528 unmapped: 163840 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:58.207093+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62742528 unmapped: 163840 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:59.207264+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:29.168142+0000 osd.2 (osd.2) 54 : cluster [DBG] 4.1b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:29.178711+0000 osd.2 (osd.2) 55 : cluster [DBG] 4.1b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62750720 unmapped: 155648 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 421071 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:00.207481+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 55)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:29.168142+0000 osd.2 (osd.2) 54 : cluster [DBG] 4.1b scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:29.178711+0000 osd.2 (osd.2) 55 : cluster [DBG] 4.1b scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62750720 unmapped: 155648 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:01.207614+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62750720 unmapped: 155648 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.099511147s of 11.110563278s, submitted: 6
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:02.207752+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:32.104935+0000 osd.2 (osd.2) 56 : cluster [DBG] 4.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:32.115478+0000 osd.2 (osd.2) 57 : cluster [DBG] 4.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 57)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:32.104935+0000 osd.2 (osd.2) 56 : cluster [DBG] 4.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:32.115478+0000 osd.2 (osd.2) 57 : cluster [DBG] 4.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62767104 unmapped: 139264 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:03.207943+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:33.149898+0000 osd.2 (osd.2) 58 : cluster [DBG] 4.18 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:33.160452+0000 osd.2 (osd.2) 59 : cluster [DBG] 4.18 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 59)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:33.149898+0000 osd.2 (osd.2) 58 : cluster [DBG] 4.18 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:33.160452+0000 osd.2 (osd.2) 59 : cluster [DBG] 4.18 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62783488 unmapped: 122880 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:04.208120+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62791680 unmapped: 114688 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 425895 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:05.208231+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:35.159020+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:35.169684+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 61)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:35.159020+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:35.169684+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62791680 unmapped: 114688 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:06.208389+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62791680 unmapped: 114688 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:07.208525+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62799872 unmapped: 106496 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:08.208717+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62799872 unmapped: 106496 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:09.208886+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 98304 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 428308 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:10.209040+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:40.195782+0000 osd.2 (osd.2) 62 : cluster [DBG] 4.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:40.206247+0000 osd.2 (osd.2) 63 : cluster [DBG] 4.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 63)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:40.195782+0000 osd.2 (osd.2) 62 : cluster [DBG] 4.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:40.206247+0000 osd.2 (osd.2) 63 : cluster [DBG] 4.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 98304 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:11.209217+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 1 last_log 64 sent 63 num 1 unsent 1 sending 1
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:41.203443+0000 osd.2 (osd.2) 64 : cluster [DBG] 3.18 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 64)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:41.203443+0000 osd.2 (osd.2) 64 : cluster [DBG] 3.18 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62816256 unmapped: 90112 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:12.209419+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 1 last_log 65 sent 64 num 1 unsent 1 sending 1
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:41.214016+0000 osd.2 (osd.2) 65 : cluster [DBG] 3.18 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.105876923s of 10.125753403s, submitted: 10
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 65)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:41.214016+0000 osd.2 (osd.2) 65 : cluster [DBG] 3.18 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 81920 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:13.209631+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:42.230851+0000 osd.2 (osd.2) 66 : cluster [DBG] 7.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:42.241414+0000 osd.2 (osd.2) 67 : cluster [DBG] 7.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 67)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:42.230851+0000 osd.2 (osd.2) 66 : cluster [DBG] 7.1c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:42.241414+0000 osd.2 (osd.2) 67 : cluster [DBG] 7.1c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 81920 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:14.209779+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:44.186057+0000 osd.2 (osd.2) 68 : cluster [DBG] 3.16 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:44.196617+0000 osd.2 (osd.2) 69 : cluster [DBG] 3.16 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 69)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:44.186057+0000 osd.2 (osd.2) 68 : cluster [DBG] 3.16 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:44.196617+0000 osd.2 (osd.2) 69 : cluster [DBG] 3.16 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62840832 unmapped: 65536 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 437958 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:15.209921+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62840832 unmapped: 65536 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:16.210049+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62840832 unmapped: 65536 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:17.210193+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62873600 unmapped: 32768 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:18.210325+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:47.280854+0000 osd.2 (osd.2) 70 : cluster [DBG] 7.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:47.291412+0000 osd.2 (osd.2) 71 : cluster [DBG] 7.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 71)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:47.280854+0000 osd.2 (osd.2) 70 : cluster [DBG] 7.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:47.291412+0000 osd.2 (osd.2) 71 : cluster [DBG] 7.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62873600 unmapped: 32768 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:19.210503+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62881792 unmapped: 24576 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 440371 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:20.210666+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62889984 unmapped: 16384 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:21.210813+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62898176 unmapped: 8192 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:22.210961+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62906368 unmapped: 0 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:23.211123+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.043985367s of 11.063570023s, submitted: 6
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62914560 unmapped: 1040384 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:24.211262+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:53.294122+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:53.304844+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 73)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:53.294122+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:53.304844+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62922752 unmapped: 1032192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 442782 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:25.211476+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62922752 unmapped: 1032192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:26.211664+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:55.319112+0000 osd.2 (osd.2) 74 : cluster [DBG] 7.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:55.329812+0000 osd.2 (osd.2) 75 : cluster [DBG] 7.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 75)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:55.319112+0000 osd.2 (osd.2) 74 : cluster [DBG] 7.a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:55.329812+0000 osd.2 (osd.2) 75 : cluster [DBG] 7.a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62922752 unmapped: 1032192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:27.211892+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:56.324761+0000 osd.2 (osd.2) 76 : cluster [DBG] 3.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:56.335383+0000 osd.2 (osd.2) 77 : cluster [DBG] 3.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 77)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:56.324761+0000 osd.2 (osd.2) 76 : cluster [DBG] 3.11 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:56.335383+0000 osd.2 (osd.2) 77 : cluster [DBG] 3.11 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62930944 unmapped: 1024000 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:28.212190+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:57.328208+0000 osd.2 (osd.2) 78 : cluster [DBG] 7.15 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:57.338655+0000 osd.2 (osd.2) 79 : cluster [DBG] 7.15 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 79)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:57.328208+0000 osd.2 (osd.2) 78 : cluster [DBG] 7.15 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:57.338655+0000 osd.2 (osd.2) 79 : cluster [DBG] 7.15 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62939136 unmapped: 1015808 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:29.212495+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62947328 unmapped: 1007616 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 452430 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:30.212647+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:59.343361+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:59.353960+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 81)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:59.343361+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:59.353960+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62955520 unmapped: 999424 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:31.212874+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62963712 unmapped: 991232 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:32.213040+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62971904 unmapped: 983040 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:33.213255+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:02.380735+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:02.391278+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 83)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:02.380735+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:02.391278+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62980096 unmapped: 974848 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:34.213509+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.994940758s of 11.102183342s, submitted: 12
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63012864 unmapped: 942080 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 457252 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:35.213680+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:04.396685+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.5 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:04.407272+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.5 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 85)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:04.396685+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.5 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:04.407272+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.5 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63012864 unmapped: 942080 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:36.213871+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:05.348833+0000 osd.2 (osd.2) 86 : cluster [DBG] 7.2 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:05.359416+0000 osd.2 (osd.2) 87 : cluster [DBG] 7.2 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 87)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:05.348833+0000 osd.2 (osd.2) 86 : cluster [DBG] 7.2 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:05.359416+0000 osd.2 (osd.2) 87 : cluster [DBG] 7.2 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63021056 unmapped: 933888 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:37.214074+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63021056 unmapped: 933888 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:38.214245+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:07.350511+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:07.361110+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 89)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:07.350511+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.1 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:07.361110+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.1 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63029248 unmapped: 925696 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:39.214504+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63029248 unmapped: 925696 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 462074 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:40.214688+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63045632 unmapped: 909312 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:41.214935+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63053824 unmapped: 901120 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:42.215108+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63053824 unmapped: 901120 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:43.215336+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63062016 unmapped: 892928 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:44.215547+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:13.415133+0000 osd.2 (osd.2) 90 : cluster [DBG] 3.7 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:13.425765+0000 osd.2 (osd.2) 91 : cluster [DBG] 3.7 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63070208 unmapped: 884736 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 464485 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 91)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:13.415133+0000 osd.2 (osd.2) 90 : cluster [DBG] 3.7 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:13.425765+0000 osd.2 (osd.2) 91 : cluster [DBG] 3.7 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:45.215805+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63070208 unmapped: 884736 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:46.215980+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.941810608s of 11.959441185s, submitted: 8
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63078400 unmapped: 876544 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:47.216162+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:16.356251+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:16.366904+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63094784 unmapped: 860160 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 93)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:16.356251+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.c scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:16.366904+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.c scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:48.216325+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:17.373947+0000 osd.2 (osd.2) 94 : cluster [DBG] 3.1d scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:17.384530+0000 osd.2 (osd.2) 95 : cluster [DBG] 3.1d scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63111168 unmapped: 843776 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 95)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:17.373947+0000 osd.2 (osd.2) 94 : cluster [DBG] 3.1d scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:17.384530+0000 osd.2 (osd.2) 95 : cluster [DBG] 3.1d scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:49.216489+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:18.407332+0000 osd.2 (osd.2) 96 : cluster [DBG] 7.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:18.417911+0000 osd.2 (osd.2) 97 : cluster [DBG] 7.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63119360 unmapped: 835584 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 471722 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 97)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:18.407332+0000 osd.2 (osd.2) 96 : cluster [DBG] 7.1a scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:18.417911+0000 osd.2 (osd.2) 97 : cluster [DBG] 7.1a scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:50.216705+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63119360 unmapped: 835584 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:51.216828+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63127552 unmapped: 827392 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:52.216963+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63135744 unmapped: 819200 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:53.217105+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63143936 unmapped: 811008 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:54.217289+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:23.418645+0000 osd.2 (osd.2) 98 : cluster [DBG] 4.13 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:23.429196+0000 osd.2 (osd.2) 99 : cluster [DBG] 4.13 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 99)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:23.418645+0000 osd.2 (osd.2) 98 : cluster [DBG] 4.13 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:23.429196+0000 osd.2 (osd.2) 99 : cluster [DBG] 4.13 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63160320 unmapped: 794624 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 474135 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:55.217486+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63160320 unmapped: 794624 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:56.217785+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:25.473604+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.5 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:25.484176+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.5 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 101)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:25.473604+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.5 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:25.484176+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.5 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.860826492s of 10.089083672s, submitted: 10
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63168512 unmapped: 786432 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:57.218035+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:26.445208+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:26.455854+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 103)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:26.445208+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.e scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:26.455854+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.e scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63168512 unmapped: 786432 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:58.218296+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63176704 unmapped: 778240 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:59.218475+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63176704 unmapped: 778240 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 481368 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:00.218638+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:29.434275+0000 osd.2 (osd.2) 104 : cluster [DBG] 6.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:29.444869+0000 osd.2 (osd.2) 105 : cluster [DBG] 6.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 105)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:29.434275+0000 osd.2 (osd.2) 104 : cluster [DBG] 6.8 scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:29.444869+0000 osd.2 (osd.2) 105 : cluster [DBG] 6.8 scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63176704 unmapped: 778240 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:01.218854+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63184896 unmapped: 770048 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:02.219005+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:31.509926+0000 osd.2 (osd.2) 106 : cluster [DBG] 6.f scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:31.530908+0000 osd.2 (osd.2) 107 : cluster [DBG] 6.f scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 107)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:31.509926+0000 osd.2 (osd.2) 106 : cluster [DBG] 6.f scrub starts
Dec 03 21:34:37 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:31.530908+0000 osd.2 (osd.2) 107 : cluster [DBG] 6.f scrub ok
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63201280 unmapped: 753664 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:03.219205+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63209472 unmapped: 745472 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:04.219345+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63209472 unmapped: 745472 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:05.219501+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63209472 unmapped: 745472 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:06.219655+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63217664 unmapped: 737280 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:07.219993+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63217664 unmapped: 737280 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:08.220149+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63225856 unmapped: 729088 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:09.220310+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63234048 unmapped: 720896 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:10.220469+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63242240 unmapped: 712704 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:11.220672+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63242240 unmapped: 712704 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:12.220833+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63242240 unmapped: 712704 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:13.221011+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63250432 unmapped: 704512 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:14.221193+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63250432 unmapped: 704512 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:15.221683+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63275008 unmapped: 679936 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:16.221854+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63283200 unmapped: 671744 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:17.222052+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 663552 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:18.222168+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 663552 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:19.222371+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 663552 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:20.222707+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63299584 unmapped: 655360 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:21.222867+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63307776 unmapped: 647168 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:22.222985+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63307776 unmapped: 647168 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:23.223212+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 638976 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:24.223417+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 638976 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:25.223654+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 638976 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:26.223929+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63324160 unmapped: 630784 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:27.224146+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63324160 unmapped: 630784 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:28.224424+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63332352 unmapped: 622592 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:29.224669+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63332352 unmapped: 622592 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:30.224846+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63332352 unmapped: 622592 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:31.224976+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63340544 unmapped: 614400 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:32.225176+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63348736 unmapped: 606208 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:33.225374+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63356928 unmapped: 598016 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:34.225613+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63356928 unmapped: 598016 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:35.225796+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63373312 unmapped: 581632 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:36.225990+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63381504 unmapped: 573440 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:37.226144+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63389696 unmapped: 565248 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:38.226328+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63389696 unmapped: 565248 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:39.226511+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63397888 unmapped: 557056 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:40.226687+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63406080 unmapped: 548864 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:41.226875+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63406080 unmapped: 548864 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:42.227028+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63422464 unmapped: 532480 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:43.227298+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63422464 unmapped: 532480 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:44.227541+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63430656 unmapped: 524288 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:45.227754+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63430656 unmapped: 524288 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:46.227947+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:47.228103+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:48.228255+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:49.228418+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63447040 unmapped: 507904 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:50.228563+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63447040 unmapped: 507904 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:51.228794+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63463424 unmapped: 491520 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:52.228961+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63463424 unmapped: 491520 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:53.229141+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63463424 unmapped: 491520 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:54.229291+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63471616 unmapped: 483328 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:55.229478+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63471616 unmapped: 483328 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:56.229634+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63479808 unmapped: 475136 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:57.229816+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63488000 unmapped: 466944 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:58.230014+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63488000 unmapped: 466944 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:59.230164+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63496192 unmapped: 458752 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:00.230382+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63496192 unmapped: 458752 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:01.230602+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63496192 unmapped: 458752 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:02.230777+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63504384 unmapped: 450560 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:03.230978+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63504384 unmapped: 450560 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:04.231132+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63512576 unmapped: 442368 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:05.231247+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63520768 unmapped: 434176 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:06.231377+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63520768 unmapped: 434176 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:07.231521+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:08.231692+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 425984 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:09.231863+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 425984 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:10.231998+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 425984 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:11.232166+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63545344 unmapped: 409600 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:12.232293+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63545344 unmapped: 409600 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:13.232469+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63553536 unmapped: 401408 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:14.232589+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63553536 unmapped: 401408 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:15.232712+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63561728 unmapped: 393216 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:16.232922+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63578112 unmapped: 376832 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:17.233139+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63578112 unmapped: 376832 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:18.233297+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63586304 unmapped: 368640 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:19.233542+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63586304 unmapped: 368640 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:20.233839+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63594496 unmapped: 360448 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:21.234031+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63594496 unmapped: 360448 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:22.234161+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63594496 unmapped: 360448 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:23.234305+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 352256 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:24.234495+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 352256 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:25.234705+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 352256 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:26.234945+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63627264 unmapped: 327680 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:27.235153+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63627264 unmapped: 327680 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:28.235383+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63635456 unmapped: 319488 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:29.235674+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63635456 unmapped: 319488 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:30.235850+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63635456 unmapped: 319488 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:31.236095+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63651840 unmapped: 303104 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:32.236353+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63651840 unmapped: 303104 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:33.236630+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63651840 unmapped: 303104 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:34.236917+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63660032 unmapped: 294912 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:35.237081+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63660032 unmapped: 294912 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:36.237268+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63684608 unmapped: 270336 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:37.237434+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63684608 unmapped: 270336 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:38.237654+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63692800 unmapped: 262144 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:39.237834+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63692800 unmapped: 262144 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:40.237980+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63692800 unmapped: 262144 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:41.238135+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63700992 unmapped: 253952 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:42.238272+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63700992 unmapped: 253952 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:43.238466+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63709184 unmapped: 245760 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:44.238680+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63725568 unmapped: 229376 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:45.238820+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63725568 unmapped: 229376 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:46.238976+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:47.239109+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:48.239254+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:49.239515+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:50.239728+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:51.239891+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:52.240102+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:53.240327+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:54.240508+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63750144 unmapped: 204800 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:55.240675+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63750144 unmapped: 204800 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:56.240836+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63766528 unmapped: 188416 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:57.240964+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63774720 unmapped: 180224 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:58.241119+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63791104 unmapped: 163840 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:59.241262+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63799296 unmapped: 155648 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:00.241395+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63799296 unmapped: 155648 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:01.241523+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63807488 unmapped: 147456 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:02.241676+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63807488 unmapped: 147456 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:03.241840+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63807488 unmapped: 147456 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:04.241974+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63815680 unmapped: 139264 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:05.242090+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63815680 unmapped: 139264 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:06.242263+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63823872 unmapped: 131072 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:07.242432+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63832064 unmapped: 122880 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:08.242600+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63832064 unmapped: 122880 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:09.242797+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63840256 unmapped: 114688 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:10.242944+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63840256 unmapped: 114688 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:11.243138+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63848448 unmapped: 106496 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:12.243268+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63848448 unmapped: 106496 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:13.243691+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63848448 unmapped: 106496 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:14.243902+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63856640 unmapped: 98304 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:15.244047+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63856640 unmapped: 98304 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:16.244272+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63856640 unmapped: 98304 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:17.244477+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63864832 unmapped: 90112 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:18.244650+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63864832 unmapped: 90112 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:19.244783+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63873024 unmapped: 81920 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:20.244930+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63873024 unmapped: 81920 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:21.245081+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63873024 unmapped: 81920 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:22.245228+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63881216 unmapped: 73728 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:23.245491+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63881216 unmapped: 73728 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:24.245702+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63881216 unmapped: 73728 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:25.245890+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63889408 unmapped: 65536 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:26.246133+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63889408 unmapped: 65536 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:27.246408+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63897600 unmapped: 57344 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:28.246623+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63897600 unmapped: 57344 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:29.246969+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63897600 unmapped: 57344 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:30.247151+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63905792 unmapped: 49152 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:31.247377+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63913984 unmapped: 40960 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:32.247684+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63922176 unmapped: 32768 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:33.247911+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63922176 unmapped: 32768 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:34.248176+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63930368 unmapped: 24576 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:35.248424+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63930368 unmapped: 24576 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:36.248693+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63930368 unmapped: 24576 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:37.248884+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63938560 unmapped: 16384 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:38.249145+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63938560 unmapped: 16384 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:39.249427+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63946752 unmapped: 8192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:40.249688+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63946752 unmapped: 8192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:41.249995+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63946752 unmapped: 8192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:42.250297+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63954944 unmapped: 0 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:43.250631+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63954944 unmapped: 0 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:44.250800+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63963136 unmapped: 1040384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:45.251004+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63963136 unmapped: 1040384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:46.251179+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63971328 unmapped: 1032192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:47.251311+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63971328 unmapped: 1032192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:48.251462+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63971328 unmapped: 1032192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:49.251696+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63979520 unmapped: 1024000 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:50.251947+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63979520 unmapped: 1024000 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:51.252205+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63979520 unmapped: 1024000 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:52.252499+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63995904 unmapped: 1007616 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:53.252779+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63995904 unmapped: 1007616 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:54.252984+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64004096 unmapped: 999424 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:55.253201+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64004096 unmapped: 999424 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:56.253413+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64004096 unmapped: 999424 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:57.253636+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64012288 unmapped: 991232 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:58.253796+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64012288 unmapped: 991232 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:59.254003+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64020480 unmapped: 983040 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:00.254185+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64020480 unmapped: 983040 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:01.254369+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64020480 unmapped: 983040 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:02.254513+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64028672 unmapped: 974848 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:03.254670+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64028672 unmapped: 974848 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:04.254872+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64036864 unmapped: 966656 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:05.255034+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64036864 unmapped: 966656 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:06.255194+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64036864 unmapped: 966656 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:07.255358+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64045056 unmapped: 958464 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:08.255547+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64045056 unmapped: 958464 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:09.255697+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64045056 unmapped: 958464 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:10.255847+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64053248 unmapped: 950272 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:11.255998+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64053248 unmapped: 950272 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:12.256145+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64061440 unmapped: 942080 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:13.256315+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64061440 unmapped: 942080 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:14.256493+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:15.256647+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:16.256835+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64061440 unmapped: 942080 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:17.256982+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:18.257119+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:19.257235+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:20.257389+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64077824 unmapped: 925696 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:21.257548+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64077824 unmapped: 925696 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:22.257692+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64086016 unmapped: 917504 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:23.257910+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64086016 unmapped: 917504 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:24.258098+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64094208 unmapped: 909312 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:25.258221+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64094208 unmapped: 909312 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:26.258364+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64102400 unmapped: 901120 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:27.258498+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64102400 unmapped: 901120 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:28.258633+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64102400 unmapped: 901120 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:29.258794+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64110592 unmapped: 892928 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:30.258944+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64110592 unmapped: 892928 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:31.259126+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64110592 unmapped: 892928 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:32.259316+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64118784 unmapped: 884736 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:33.259494+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64118784 unmapped: 884736 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:34.259657+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64126976 unmapped: 876544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:35.259847+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64126976 unmapped: 876544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:36.259996+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64126976 unmapped: 876544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:37.260189+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64135168 unmapped: 868352 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:38.260369+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64135168 unmapped: 868352 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:39.260516+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64143360 unmapped: 860160 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:40.260669+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64143360 unmapped: 860160 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:41.260844+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64143360 unmapped: 860160 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:42.260986+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64151552 unmapped: 851968 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:43.261161+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64151552 unmapped: 851968 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:44.261302+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64159744 unmapped: 843776 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:45.261522+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64159744 unmapped: 843776 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:46.285707+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64159744 unmapped: 843776 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:47.285856+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64167936 unmapped: 835584 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:48.285961+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64167936 unmapped: 835584 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:49.286122+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64167936 unmapped: 835584 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:50.286264+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64176128 unmapped: 827392 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:51.286373+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64176128 unmapped: 827392 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:52.286482+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64184320 unmapped: 819200 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:53.286664+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64184320 unmapped: 819200 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:54.286814+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64184320 unmapped: 819200 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:55.286939+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64192512 unmapped: 811008 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:56.287073+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64192512 unmapped: 811008 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:57.287214+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64192512 unmapped: 811008 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:58.287330+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64200704 unmapped: 802816 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:59.287484+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64200704 unmapped: 802816 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:00.287744+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 794624 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:01.287995+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 794624 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:02.288137+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64217088 unmapped: 786432 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:03.288410+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64217088 unmapped: 786432 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:04.288563+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64217088 unmapped: 786432 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:05.288739+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64225280 unmapped: 778240 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:06.288923+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64225280 unmapped: 778240 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:07.289100+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64233472 unmapped: 770048 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:08.289264+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64233472 unmapped: 770048 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:09.289390+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 761856 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:10.289584+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 761856 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:11.289734+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 761856 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:12.289889+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64249856 unmapped: 753664 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:13.290935+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64249856 unmapped: 753664 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:14.291094+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64249856 unmapped: 753664 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:15.291205+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64258048 unmapped: 745472 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:16.291356+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64258048 unmapped: 745472 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:17.291498+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64266240 unmapped: 737280 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:18.291656+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64266240 unmapped: 737280 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:19.291791+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64266240 unmapped: 737280 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:20.291925+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64274432 unmapped: 729088 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:21.292086+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64274432 unmapped: 729088 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:22.292317+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64274432 unmapped: 729088 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:23.292493+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64282624 unmapped: 720896 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:24.292636+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64282624 unmapped: 720896 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:25.292813+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64290816 unmapped: 712704 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:26.292960+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64290816 unmapped: 712704 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:27.293134+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64290816 unmapped: 712704 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:28.293271+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64299008 unmapped: 704512 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:29.293460+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64299008 unmapped: 704512 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:30.293607+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64299008 unmapped: 704512 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:31.293738+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64307200 unmapped: 696320 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:32.293856+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64307200 unmapped: 696320 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:33.293996+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64315392 unmapped: 688128 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:34.294150+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64315392 unmapped: 688128 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:35.294330+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64323584 unmapped: 679936 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:36.294498+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64323584 unmapped: 679936 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:37.294677+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64323584 unmapped: 679936 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:38.294815+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64331776 unmapped: 671744 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:39.294948+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64331776 unmapped: 671744 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:40.295083+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64339968 unmapped: 663552 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:41.295203+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64339968 unmapped: 663552 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:42.295344+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64339968 unmapped: 663552 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:43.295565+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64348160 unmapped: 655360 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:44.295800+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64348160 unmapped: 655360 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:45.295951+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64348160 unmapped: 655360 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:46.296159+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64356352 unmapped: 647168 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:47.296316+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64356352 unmapped: 647168 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:48.296512+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64364544 unmapped: 638976 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:49.296732+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64364544 unmapped: 638976 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:50.296864+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64364544 unmapped: 638976 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:51.296977+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64372736 unmapped: 630784 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:52.297084+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64372736 unmapped: 630784 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:53.297221+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64372736 unmapped: 630784 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:54.297447+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64380928 unmapped: 622592 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:55.297659+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64380928 unmapped: 622592 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:56.297811+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64389120 unmapped: 614400 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:57.298001+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64389120 unmapped: 614400 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:58.298115+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64397312 unmapped: 606208 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:59.298321+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64397312 unmapped: 606208 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:00.298479+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64397312 unmapped: 606208 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:01.298698+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64405504 unmapped: 598016 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:02.298873+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64405504 unmapped: 598016 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:03.299077+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64405504 unmapped: 598016 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:04.299321+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:05.299491+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:06.299666+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:07.299876+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 581632 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:08.300044+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 581632 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:09.300254+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64430080 unmapped: 573440 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:10.300459+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64430080 unmapped: 573440 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:11.300665+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64430080 unmapped: 573440 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:12.300830+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64438272 unmapped: 565248 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:13.301021+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64438272 unmapped: 565248 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:14.301189+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64446464 unmapped: 557056 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:15.301373+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64446464 unmapped: 557056 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:16.301507+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64454656 unmapped: 548864 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:17.301705+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64454656 unmapped: 548864 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:18.301851+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64454656 unmapped: 548864 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:19.301972+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64462848 unmapped: 540672 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:20.302088+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64471040 unmapped: 532480 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:21.302202+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64471040 unmapped: 532480 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:22.302343+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64479232 unmapped: 524288 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:23.302884+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64479232 unmapped: 524288 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:24.303018+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64479232 unmapped: 524288 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:25.303190+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64487424 unmapped: 516096 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:26.303332+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64487424 unmapped: 516096 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:27.303464+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64495616 unmapped: 507904 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:28.303603+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64495616 unmapped: 507904 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:29.303729+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64503808 unmapped: 499712 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:30.303864+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64503808 unmapped: 499712 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:31.304072+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64512000 unmapped: 491520 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:32.304225+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64512000 unmapped: 491520 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:33.304428+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64512000 unmapped: 491520 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:34.304638+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64520192 unmapped: 483328 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:35.304781+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64520192 unmapped: 483328 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:36.304946+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64528384 unmapped: 475136 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:37.305129+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64528384 unmapped: 475136 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:38.305252+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64528384 unmapped: 475136 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:39.305382+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64536576 unmapped: 466944 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:40.305550+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64536576 unmapped: 466944 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:41.305781+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64536576 unmapped: 466944 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:42.305958+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64544768 unmapped: 458752 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:43.306159+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64544768 unmapped: 458752 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:44.306316+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64552960 unmapped: 450560 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:45.306521+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64552960 unmapped: 450560 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:46.306642+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64552960 unmapped: 450560 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:47.306799+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64561152 unmapped: 442368 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:48.306949+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64561152 unmapped: 442368 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:49.307152+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64569344 unmapped: 434176 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:50.307302+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64569344 unmapped: 434176 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:51.307514+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64569344 unmapped: 434176 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:52.307655+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64577536 unmapped: 425984 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:53.307832+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64577536 unmapped: 425984 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:54.307999+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64585728 unmapped: 417792 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:55.308190+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64585728 unmapped: 417792 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:56.308397+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64585728 unmapped: 417792 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:57.308714+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64593920 unmapped: 409600 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:58.308881+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64593920 unmapped: 409600 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:59.309097+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 401408 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:00.309220+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:01.309334+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 401408 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:02.309607+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 401408 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:03.309796+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64610304 unmapped: 393216 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:04.309928+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64610304 unmapped: 393216 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:05.310048+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64618496 unmapped: 385024 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:06.310160+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64618496 unmapped: 385024 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:07.310441+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64618496 unmapped: 385024 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:08.310649+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64626688 unmapped: 376832 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:09.310819+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64626688 unmapped: 376832 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:10.310969+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64634880 unmapped: 368640 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:11.311133+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64634880 unmapped: 368640 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:12.311285+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64634880 unmapped: 368640 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:13.311491+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64643072 unmapped: 360448 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:14.311685+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64643072 unmapped: 360448 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:15.311841+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64643072 unmapped: 360448 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:16.312011+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64651264 unmapped: 352256 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:17.312120+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64651264 unmapped: 352256 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:18.312281+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64659456 unmapped: 344064 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:19.312438+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64659456 unmapped: 344064 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:20.312513+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64659456 unmapped: 344064 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:21.312661+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64667648 unmapped: 335872 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:22.312783+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64667648 unmapped: 335872 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:23.312928+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64675840 unmapped: 327680 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:24.313170+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64675840 unmapped: 327680 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:25.313363+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64675840 unmapped: 327680 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:26.313502+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64684032 unmapped: 319488 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:27.313685+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64684032 unmapped: 319488 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:28.313809+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64684032 unmapped: 319488 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:29.313934+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:30.314309+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:31.314469+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:32.314652+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:33.314871+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64700416 unmapped: 303104 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:34.315113+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64700416 unmapped: 303104 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:35.315239+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64700416 unmapped: 303104 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:36.315383+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64708608 unmapped: 294912 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:37.315554+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64708608 unmapped: 294912 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:38.315737+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 286720 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:39.315899+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 286720 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:40.316093+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 286720 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:41.316272+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64724992 unmapped: 278528 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:42.316436+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64724992 unmapped: 278528 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:43.316674+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64724992 unmapped: 278528 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:44.316899+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64733184 unmapped: 270336 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:45.317059+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64733184 unmapped: 270336 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:46.317193+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64733184 unmapped: 270336 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:47.317388+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64741376 unmapped: 262144 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:48.317541+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64741376 unmapped: 262144 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 16.19 MB, 0.03 MB/s
                                           Interval WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:49.317647+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64806912 unmapped: 196608 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:50.317780+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64806912 unmapped: 196608 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:51.318036+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64815104 unmapped: 188416 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:52.318313+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64815104 unmapped: 188416 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:53.318762+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64815104 unmapped: 188416 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:54.318927+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64823296 unmapped: 180224 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:55.319419+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64823296 unmapped: 180224 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:56.319629+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64831488 unmapped: 172032 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:57.320022+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64831488 unmapped: 172032 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:58.320160+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64839680 unmapped: 163840 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:59.320361+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64839680 unmapped: 163840 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:00.320631+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64839680 unmapped: 163840 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:01.320885+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64847872 unmapped: 155648 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:02.321111+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64847872 unmapped: 155648 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:03.321336+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64847872 unmapped: 155648 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:04.321743+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64856064 unmapped: 147456 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:05.322130+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64856064 unmapped: 147456 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:06.322616+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64864256 unmapped: 139264 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:07.322942+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64864256 unmapped: 139264 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:08.323180+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64864256 unmapped: 139264 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:09.323454+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64872448 unmapped: 131072 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:10.323873+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64872448 unmapped: 131072 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:11.324231+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64880640 unmapped: 122880 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:12.324538+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64880640 unmapped: 122880 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:13.324929+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64880640 unmapped: 122880 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:14.325237+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64888832 unmapped: 114688 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:15.325679+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64888832 unmapped: 114688 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:16.325900+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64897024 unmapped: 106496 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:17.326133+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64897024 unmapped: 106496 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:18.326383+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64897024 unmapped: 106496 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:19.326749+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64905216 unmapped: 98304 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:20.327026+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64905216 unmapped: 98304 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:21.327327+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64921600 unmapped: 81920 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:22.327736+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64921600 unmapped: 81920 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:23.328077+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64921600 unmapped: 81920 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:24.328340+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64929792 unmapped: 73728 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:25.328707+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64929792 unmapped: 73728 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:26.328931+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64937984 unmapped: 65536 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:27.329189+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64937984 unmapped: 65536 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:28.329428+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64937984 unmapped: 65536 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:29.329696+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64946176 unmapped: 57344 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:30.330020+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64946176 unmapped: 57344 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:31.330305+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64946176 unmapped: 57344 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:32.330641+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64954368 unmapped: 49152 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:33.330922+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64954368 unmapped: 49152 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:34.331191+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64962560 unmapped: 40960 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:35.331372+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64962560 unmapped: 40960 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:36.331519+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64970752 unmapped: 32768 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:37.331723+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64970752 unmapped: 32768 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:38.331976+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64970752 unmapped: 32768 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:39.332255+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64978944 unmapped: 24576 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:40.332669+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64978944 unmapped: 24576 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:41.332946+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64987136 unmapped: 16384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:42.333236+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64987136 unmapped: 16384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:43.334002+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64987136 unmapped: 16384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:44.334245+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64995328 unmapped: 8192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:45.334515+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64995328 unmapped: 8192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:46.334950+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65003520 unmapped: 0 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:47.335199+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65003520 unmapped: 0 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:48.335400+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65003520 unmapped: 0 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:49.335725+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65011712 unmapped: 1040384 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:50.335975+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65011712 unmapped: 1040384 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:51.336267+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65019904 unmapped: 1032192 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:52.336590+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65028096 unmapped: 1024000 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:53.337000+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65028096 unmapped: 1024000 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:54.338188+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:55.339313+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:56.339618+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:57.339810+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65044480 unmapped: 1007616 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:58.339977+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65044480 unmapped: 1007616 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:59.340141+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65044480 unmapped: 1007616 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:00.340289+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65052672 unmapped: 999424 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:01.340438+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:02.340560+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:03.340844+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:04.341014+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65069056 unmapped: 983040 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:05.341159+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65069056 unmapped: 983040 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:06.341316+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65069056 unmapped: 983040 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:07.341523+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65077248 unmapped: 974848 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:08.341641+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65085440 unmapped: 966656 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:09.341782+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65085440 unmapped: 966656 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:10.341964+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65085440 unmapped: 966656 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:11.342132+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65093632 unmapped: 958464 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:12.342397+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65093632 unmapped: 958464 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:13.342637+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:14.342808+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:15.342993+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:16.343153+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65110016 unmapped: 942080 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:17.343311+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65110016 unmapped: 942080 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:18.343446+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65110016 unmapped: 942080 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:19.343590+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:20.343767+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:21.343932+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:22.344079+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:23.344251+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:24.344411+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65126400 unmapped: 925696 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:25.344668+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65126400 unmapped: 925696 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:26.345011+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 917504 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:27.345149+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 917504 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:28.345358+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 917504 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:29.345543+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65142784 unmapped: 909312 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:30.345730+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65142784 unmapped: 909312 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:31.345889+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:32.346047+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:33.346207+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:34.346354+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65159168 unmapped: 892928 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:35.346493+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65159168 unmapped: 892928 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:36.346709+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:37.346936+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:38.347061+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:39.377242+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:40.377378+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:41.377535+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:42.377946+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:43.378662+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:44.378834+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:45.378986+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:46.379711+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:47.379871+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:48.380059+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:49.380191+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:50.380357+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:51.380527+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:52.380684+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:53.381637+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:54.382020+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:55.382769+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:56.382996+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:57.383314+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:58.383533+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:59.383675+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:00.383854+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:01.383992+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:02.384146+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:03.384417+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:04.384601+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:05.384734+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:06.384892+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:07.385084+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:08.385252+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:09.385425+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:10.385733+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:11.386048+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:12.386244+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:13.386543+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:14.386641+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:15.386902+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:16.387044+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:17.387212+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:18.387417+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:19.387634+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:20.387767+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:21.387923+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:22.388113+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:23.388323+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:24.388453+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:25.388626+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:26.388841+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:27.389109+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:28.389371+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:29.389681+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:30.389972+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:31.390212+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:32.390430+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:33.390749+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:34.391000+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:35.391229+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:36.391487+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:37.391646+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:38.391801+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:39.392019+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:40.392260+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:41.392471+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:42.392696+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:43.392946+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:44.393130+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:45.393318+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:46.393494+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:47.393627+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:48.393990+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:49.394157+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:50.394362+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:51.394722+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:52.394898+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:53.395091+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:54.395242+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:55.395430+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:56.395644+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:57.395805+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:58.396010+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:59.396226+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:00.396476+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:01.396782+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:02.396950+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:03.397130+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:04.397336+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:05.397506+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:06.397655+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:07.397805+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:08.397935+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:09.398048+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:10.398163+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:11.398382+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:12.398542+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:13.398779+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:14.398900+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:15.399033+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:16.399187+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:17.399322+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:18.399477+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:19.399622+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:20.399761+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:21.399908+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:22.400027+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:23.400196+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:24.400362+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:25.400514+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:26.400701+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:27.400803+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:28.400940+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:29.401140+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:30.401384+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:31.401680+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:32.402027+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:33.402201+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:34.402351+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:35.402491+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:36.402643+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:37.402826+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:38.402937+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:39.403172+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:40.403425+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:41.403642+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:42.403791+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:43.403996+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:44.404178+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:45.404767+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:46.405253+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:47.405448+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:48.405631+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:49.405884+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:50.406039+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:51.406225+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:52.406376+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:53.406634+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:54.406875+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:55.407032+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:56.407222+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:57.407383+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:58.407519+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:59.407683+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:00.407836+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:01.407972+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:02.408134+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:03.408431+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:04.408620+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:05.408758+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:06.408899+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:07.409025+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:08.409273+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:09.409499+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:10.409641+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:11.409782+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:12.409942+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:13.410319+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:14.410483+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:15.410651+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:16.410832+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:17.410990+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:18.411128+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:19.411291+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:20.411454+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:21.411658+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:22.411900+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:23.412156+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:24.412358+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:25.412536+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:26.412695+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:27.412851+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:28.413021+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:29.413179+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:30.413332+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:31.413503+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:32.413688+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:33.413896+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:34.414102+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:35.414306+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:36.414562+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:37.414818+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:38.414976+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:39.415192+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:40.415349+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:41.415536+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:42.415644+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:43.415822+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:44.416273+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:45.416484+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:46.416642+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:47.416754+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:48.417010+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:49.417158+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:50.417318+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:51.417460+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:52.417600+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:53.418475+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:54.418666+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:55.418862+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:56.419009+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:57.419127+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:58.419285+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:59.419412+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:00.419606+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:01.419753+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:02.419905+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:03.420075+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:04.420202+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:05.420374+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:06.420530+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:07.420794+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:08.421056+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:09.421261+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14752 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:37 compute-0 ceph-mon[75204]: pgmap v888: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:37 compute-0 ceph-mon[75204]: from='client.14742 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:37 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4143601676' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:34:37 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2866941577' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 03 21:34:37 compute-0 ceph-mon[75204]: from='client.14748 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:10.421456+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:11.421636+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:12.421823+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:13.422003+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:14.422200+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:15.422398+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:16.422562+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:17.422725+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:18.422981+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:19.423210+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:20.432197+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:21.432446+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:22.432653+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:23.432869+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:24.433053+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:25.433188+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:26.433520+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:27.433781+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:28.434017+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:29.434183+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:30.434323+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:31.434535+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:32.434759+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:33.434956+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:34.435136+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:35.435443+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:36.435643+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:37.435952+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:38.436079+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:39.436236+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:40.436448+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:41.436690+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:42.436830+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:43.437123+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:44.437244+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:45.437399+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:46.437643+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:47.437837+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:48.438009+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:49.438225+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:50.438452+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:51.438669+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:52.438800+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:53.438983+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:54.439141+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:55.439362+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:56.439495+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:57.439616+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:58.439776+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:59.439894+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:00.440069+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:01.440336+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:02.440508+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:03.440774+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:04.440981+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:05.441180+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:06.441340+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:07.441527+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:08.441704+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:09.441886+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:10.442023+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:11.442269+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:12.442461+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:13.442650+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:14.442831+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:15.442979+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:16.443185+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:17.443332+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:18.443497+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:19.443632+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:20.443780+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:21.443944+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:22.444068+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:23.444247+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:24.444402+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:25.444633+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:26.444792+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:27.444953+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:28.445150+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:29.445276+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:30.445467+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:31.445645+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:32.445805+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:33.445995+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:34.446139+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:35.446313+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:36.446545+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:37.446673+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:38.446823+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:39.446979+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:40.447145+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:41.447387+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:42.447547+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:43.447777+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:44.447965+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:45.448124+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:46.448356+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:47.448525+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:48.448670+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:49.448795+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:50.448956+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:51.449131+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:52.449315+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:53.449499+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:54.449674+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:55.449808+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:56.450019+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:57.450139+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:58.450324+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:59.450507+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:00.450757+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:01.450913+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:02.451071+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:03.451278+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:04.451424+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:05.451763+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:06.451922+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:07.452079+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:08.452243+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:09.452380+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:10.452611+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:11.452782+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:12.452957+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:13.453154+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:14.453309+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:15.453498+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:16.453646+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:17.453770+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:18.453952+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:19.454093+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:20.454239+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:21.454398+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:22.454523+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:23.454657+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:24.454780+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:25.454905+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:26.455087+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:27.455205+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:28.455305+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:29.455497+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:30.455643+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:31.455847+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:32.455966+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:33.456201+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:34.456360+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:35.456551+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:36.456741+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:37.456915+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:38.457046+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:39.457227+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:40.457390+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:41.457545+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:42.457630+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:43.457796+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:44.457936+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:45.458097+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:46.458319+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:47.458451+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:48.458663+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:49.458845+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:50.459081+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:51.459347+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:52.459482+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:53.459692+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:54.459857+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:55.460025+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:56.460238+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:57.460449+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:58.460676+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:59.460811+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:00.461012+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:01.461262+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:02.461421+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:03.461646+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:04.461866+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:05.462089+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:06.462263+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:07.462473+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:08.462654+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:09.462823+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:10.462954+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:11.463169+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:12.463381+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:13.463603+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:14.463803+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:15.463950+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:16.464114+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:17.464356+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:18.464519+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:19.464652+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:20.464829+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:21.465012+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:22.465206+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:23.465396+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:24.465521+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:25.465719+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:26.465932+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:27.466119+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:28.466323+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:29.466541+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:30.466710+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:31.466866+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:32.467031+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:33.467254+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:34.467431+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:35.467669+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:36.467867+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:37.468062+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:38.468216+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:39.468403+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:40.468546+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:41.468768+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:42.468925+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:43.469097+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:44.469286+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:45.469460+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:46.469661+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:47.469830+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:48.469975+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:49.470136+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:50.470349+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:51.470624+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:52.470851+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:53.471054+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:54.471229+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:55.471389+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:56.471551+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:57.471757+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:58.471910+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:59.472028+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:00.472172+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:01.472325+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:02.472504+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:03.472735+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:04.472860+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:05.473039+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:06.473229+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:07.473390+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:08.473631+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:09.473840+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:10.474002+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:11.474155+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:12.474335+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:13.474554+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:14.474769+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:15.474946+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:16.475218+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:17.475397+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:18.475671+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:19.475959+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:20.476198+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:21.476487+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:22.476862+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:23.477230+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:24.477520+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:25.477817+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:26.478047+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:27.478304+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:28.478548+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:29.478863+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:30.479105+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:31.479276+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:32.479440+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:33.479605+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:34.479773+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:35.479921+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:36.480104+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:37.480307+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:38.480680+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:39.480804+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:40.481012+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:41.481247+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:42.481471+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:43.481648+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:44.481793+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:45.481941+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:46.482112+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:47.482276+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:48.482474+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:49.482676+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 704512 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:50.482862+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 704512 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:51.483003+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:52.483165+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:53.483387+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:54.483685+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:55.484379+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:56.484801+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:57.484972+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:58.485823+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:59.486466+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:00.486895+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:01.487279+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:02.487553+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:03.488249+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:04.488781+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:05.489233+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:06.489648+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:07.490064+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:08.490351+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:09.490700+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:10.490970+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:11.491238+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:12.491453+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:13.491755+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:14.491985+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:15.492276+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:16.492516+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:17.492689+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:18.492949+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:19.493137+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:20.493306+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:21.493497+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:22.493703+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:23.493910+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:24.494117+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:25.494300+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:26.494456+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:27.494618+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:28.494957+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:29.495240+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:30.495398+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:31.495601+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:32.495789+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:33.496080+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:34.496551+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:35.496935+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:36.497396+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:37.497545+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:38.497845+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:39.498052+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:40.498200+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:41.498383+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:42.498548+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:43.498760+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:44.498946+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:45.499140+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:46.499307+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:47.499500+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:48.499653+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:49.499887+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:50.500029+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:51.500339+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:52.500505+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:53.500805+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:54.500991+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:55.501159+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:56.501341+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:57.501525+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:58.501704+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:59.501895+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:00.502046+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:01.502292+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:02.502503+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:03.502697+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:04.502875+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:05.537111+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:06.537290+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:07.537509+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:08.537781+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:09.538011+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:10.538253+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:11.538482+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:12.538659+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:13.538891+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:14.539072+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:15.539355+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:16.539536+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:17.540042+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:18.540282+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08fd3c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:19.540451+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 62 handle_osd_map epochs [62,63], i have 62, src has [1,63]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1103.114379883s of 1103.126831055s, submitted: 6
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65503232 unmapped: 548864 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:20.540647+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65617920 unmapped: 17219584 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:21.540831+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 64 handle_osd_map epochs [64,65], i have 64, src has [1,65]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 65 ms_handle_reset con 0x559f08fd3c00 session 0x559f09d6b500
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 17137664 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:22.541033+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 65 heartbeat osd_stat(store_statfs(0x4fd914000/0x0/0x4ffc00000, data 0x84bbf8/0x8b6000, compress 0x0/0x0/0x0, omap 0x85f8, meta 0x1a27a08), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 17137664 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:23.541282+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09d9a800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 539076 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 17006592 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:24.541527+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 65 handle_osd_map epochs [65,66], i have 65, src has [1,66]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 66 ms_handle_reset con 0x559f09d9a800 session 0x559f0b0ea380
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:25.541691+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:26.541837+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:27.542036+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:28.542222+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 66 heartbeat osd_stat(store_statfs(0x4fd910000/0x0/0x4ffc00000, data 0x84d204/0x8ba000, compress 0x0/0x0/0x0, omap 0x8924, meta 0x1a276dc), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 543588 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:29.542536+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:30.542751+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 66 heartbeat osd_stat(store_statfs(0x4fd910000/0x0/0x4ffc00000, data 0x84d204/0x8ba000, compress 0x0/0x0/0x0, omap 0x8924, meta 0x1a276dc), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:31.542959+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:32.543168+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:33.543334+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 543588 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:34.543517+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:35.543728+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.887508392s of 16.273990631s, submitted: 31
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:36.543938+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:37.544198+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:38.544361+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:39.544597+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:40.544803+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:41.545010+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:42.545179+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:43.545390+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:44.545541+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:45.545695+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:46.545898+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:47.546137+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:48.546389+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:49.546632+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:50.546752+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:51.546974+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:52.547145+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:53.547467+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:54.547655+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:55.547832+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:56.548022+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:57.548268+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:58.548482+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:59.548754+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:00.548902+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:01.549136+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.484739304s of 26.491054535s, submitted: 13
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:02.549344+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66101248 unmapped: 16736256 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 68 ms_handle_reset con 0x559f0b2db800 session 0x559f0b1041c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:03.549563+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 16687104 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 552517 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:04.549765+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 16687104 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 68 heartbeat osd_stat(store_statfs(0x4fd908000/0x0/0x4ffc00000, data 0x8500a4/0x8c2000, compress 0x0/0x0/0x0, omap 0x8f21, meta 0x1a270df), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:05.549921+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 16588800 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:06.550079+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 24756224 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:07.550238+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 24715264 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 69 ms_handle_reset con 0x559f0b2db400 session 0x559f0b09f6c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fc10a000/0x0/0x4ffc00000, data 0x20500a4/0x20c2000, compress 0x0/0x0/0x0, omap 0x9267, meta 0x1a26d99), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:08.550372+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 24690688 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 70 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09d6a540
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 563102 data_alloc: 218103808 data_used: 666
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08fd3c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:09.550511+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 23683072 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 70 heartbeat osd_stat(store_statfs(0x4fc105000/0x0/0x4ffc00000, data 0x2051671/0x20c5000, compress 0x0/0x0/0x0, omap 0x95e7, meta 0x1a26a19), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 70 handle_osd_map epochs [70,71], i have 70, src has [1,71]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f08fd3c00 session 0x559f09ce7180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f0b2db400 session 0x559f0b0b1dc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fd8fe000/0x0/0x4ffc00000, data 0x852cc5/0x8ca000, compress 0x0/0x0/0x0, omap 0x9dc1, meta 0x1a2623f), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:10.550678+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67592192 unmapped: 23642112 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f0b2db800 session 0x559f0b0c9c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b0c8380
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:11.550860+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 23379968 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcb800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:12.550999+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fd8fe000/0x0/0x4ffc00000, data 0x853ec3/0x8cb000, compress 0x0/0x0/0x0, omap 0xa06d, meta 0x1a25f93), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 23240704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.315886497s of 10.602708817s, submitted: 111
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 72 ms_handle_reset con 0x559f09dcb800 session 0x559f0b0c8fc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08fd3c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 72 ms_handle_reset con 0x559f08fd3c00 session 0x559f0af396c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 72 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b079a40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:13.551225+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 23543808 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 572170 data_alloc: 218103808 data_used: 4743
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:14.551361+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 23289856 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 72 handle_osd_map epochs [72,73], i have 72, src has [1,73]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 73 ms_handle_reset con 0x559f08567c00 session 0x559f0b078c40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 73 ms_handle_reset con 0x559f08566c00 session 0x559f09cc2700
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:15.551535+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af3bc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 22831104 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 73 handle_osd_map epochs [73,74], i have 73, src has [1,74]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 74 ms_handle_reset con 0x559f0af3bc00 session 0x559f09cc3a40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 74 ms_handle_reset con 0x559f08566400 session 0x559f0b05cc40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:16.551645+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 22675456 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:17.551763+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 22675456 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fd8f1000/0x0/0x4ffc00000, data 0x8595b7/0x8d5000, compress 0x0/0x0/0x0, omap 0xb3a3, meta 0x1a24c5d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:18.551975+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 22675456 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fd8f1000/0x0/0x4ffc00000, data 0x8595b7/0x8d5000, compress 0x0/0x0/0x0, omap 0xb3a3, meta 0x1a24c5d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 580126 data_alloc: 218103808 data_used: 4727
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:19.552165+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68591616 unmapped: 22642688 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af3bc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:20.552375+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 22831104 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 75 ms_handle_reset con 0x559f0af3bc00 session 0x559f0977ea80
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:21.552549+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 22831104 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:22.552811+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 22773760 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.737900734s of 10.061096191s, submitted: 136
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 77 ms_handle_reset con 0x559f08566c00 session 0x559f0977f880
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:23.552981+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 22659072 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 77 handle_osd_map epochs [77,78], i have 77, src has [1,78]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 78 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b0c9340
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 78 heartbeat osd_stat(store_statfs(0x4fd8eb000/0x0/0x4ffc00000, data 0x85d114/0x8df000, compress 0x0/0x0/0x0, omap 0xc01d, meta 0x1a23fe3), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 602228 data_alloc: 218103808 data_used: 12849
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:24.553181+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 78 heartbeat osd_stat(store_statfs(0x4fd8eb000/0x0/0x4ffc00000, data 0x85d114/0x8df000, compress 0x0/0x0/0x0, omap 0xc01d, meta 0x1a23fe3), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 22609920 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 79 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09ce7180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:25.553332+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 22757376 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 80 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b0c9880
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:26.553469+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69566464 unmapped: 21667840 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 81 ms_handle_reset con 0x559f08567c00 session 0x559f0977ee00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:27.553631+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:28.553808+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 620604 data_alloc: 218103808 data_used: 12849
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:29.553965+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:30.554184+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 82 heartbeat osd_stat(store_statfs(0x4fd8d1000/0x0/0x4ffc00000, data 0x86555b/0x8f3000, compress 0x0/0x0/0x0, omap 0xd14b, meta 0x1a22eb5), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 82 handle_osd_map epochs [82,83], i have 82, src has [1,83]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 83 ms_handle_reset con 0x559f08566c00 session 0x559f09cc2c40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:31.554353+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 20512768 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 84 ms_handle_reset con 0x559f08566400 session 0x559f09d6bdc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:32.554615+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af3bc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 20488192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 84 handle_osd_map epochs [84,85], i have 84, src has [1,85]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.950626373s of 10.085215569s, submitted: 81
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 85 ms_handle_reset con 0x559f0af3bc00 session 0x559f0af39500
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:33.554789+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 19333120 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 85 handle_osd_map epochs [85,86], i have 85, src has [1,86]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 86 ms_handle_reset con 0x559f08566400 session 0x559f0af39c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:34.554974+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 625457 data_alloc: 218103808 data_used: 12849
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 19038208 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 87 ms_handle_reset con 0x559f08566c00 session 0x559f0af388c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 87 heartbeat osd_stat(store_statfs(0x4fd8d1000/0x0/0x4ffc00000, data 0x8695c0/0x8f9000, compress 0x0/0x0/0x0, omap 0xe12c, meta 0x1a21ed4), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:35.555107+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 18882560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 88 ms_handle_reset con 0x559f08567c00 session 0x559f09cc3180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:36.555326+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 18857984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:37.555517+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fd8ca000/0x0/0x4ffc00000, data 0x86b287/0x8fc000, compress 0x0/0x0/0x0, omap 0xeb23, meta 0x1a214dd), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:38.555749+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:39.555967+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 629504 data_alloc: 218103808 data_used: 12849
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:40.556177+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fd8ca000/0x0/0x4ffc00000, data 0x86b287/0x8fc000, compress 0x0/0x0/0x0, omap 0xeb23, meta 0x1a214dd), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:41.556354+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 18784256 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:42.556516+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 18784256 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:43.556830+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 18767872 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:44.556998+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 631348 data_alloc: 218103808 data_used: 12849
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 18767872 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.084028244s of 12.300132751s, submitted: 126
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:45.557127+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72531968 unmapped: 18702336 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 90 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b05c380
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:46.557287+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fc726000/0x0/0x4ffc00000, data 0x86e168/0x904000, compress 0x0/0x0/0x0, omap 0xf537, meta 0x2bc0ac9), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:47.557457+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:48.557685+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:49.557800+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 638855 data_alloc: 218103808 data_used: 12849
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:50.557941+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 90 ms_handle_reset con 0x559f0b2db800 session 0x559f09cc3c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fc726000/0x0/0x4ffc00000, data 0x86e168/0x904000, compress 0x0/0x0/0x0, omap 0xf537, meta 0x2bc0ac9), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:51.558080+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72630272 unmapped: 18604032 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f08566c00 session 0x559f0af38000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f08566400 session 0x559f090328c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f08567c00 session 0x559f0977e700
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:52.558205+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:53.558365+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:54.558547+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 646548 data_alloc: 218103808 data_used: 12865
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:55.558698+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 92 heartbeat osd_stat(store_statfs(0x4fc71e000/0x0/0x4ffc00000, data 0x870d56/0x90a000, compress 0x0/0x0/0x0, omap 0xfe38, meta 0x2bc01c8), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:56.558885+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:57.559076+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:58.559297+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:59.559485+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 646548 data_alloc: 218103808 data_used: 12865
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:00.559653+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b09f880
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.700133324s of 15.770095825s, submitted: 49
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:01.559846+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 93 heartbeat osd_stat(store_statfs(0x4fc71e000/0x0/0x4ffc00000, data 0x870d56/0x90a000, compress 0x0/0x0/0x0, omap 0xfe38, meta 0x2bc01c8), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 93 ms_handle_reset con 0x559f0b2db400 session 0x559f0b05d180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 18374656 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:02.560014+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 18374656 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 93 ms_handle_reset con 0x559f08566400 session 0x559f0b05ddc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:03.560230+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 18341888 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 95 ms_handle_reset con 0x559f08566c00 session 0x559f0b05da40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:04.560369+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 660485 data_alloc: 218103808 data_used: 12865
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 95 heartbeat osd_stat(store_statfs(0x4fc717000/0x0/0x4ffc00000, data 0x8737db/0x911000, compress 0x0/0x0/0x0, omap 0x10399, meta 0x2bbfc67), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 17973248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:05.560523+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 17973248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:06.560701+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 17973248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:07.560857+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 95 heartbeat osd_stat(store_statfs(0x4fc6f1000/0x0/0x4ffc00000, data 0x898e2a/0x939000, compress 0x0/0x0/0x0, omap 0x10594, meta 0x2bbfa6c), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 17874944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 95 handle_osd_map epochs [95,96], i have 95, src has [1,96]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 96 ms_handle_reset con 0x559f0af1d800 session 0x559f09032fc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:08.561063+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 17891328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 97 ms_handle_reset con 0x559f0af1d400 session 0x559f0af38700
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:09.562653+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 666817 data_alloc: 218103808 data_used: 19521
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 97 heartbeat osd_stat(store_statfs(0x4fc6ee000/0x0/0x4ffc00000, data 0x89a413/0x93c000, compress 0x0/0x0/0x0, omap 0x10a4b, meta 0x2bbf5b5), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 16842752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0ae9fc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 97 ms_handle_reset con 0x559f0ae9fc00 session 0x559f0b05c540
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:10.562880+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 97 ms_handle_reset con 0x559f08566400 session 0x559f0b078700
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 16678912 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.905013084s of 10.014015198s, submitted: 64
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 98 ms_handle_reset con 0x559f08566c00 session 0x559f099688c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:11.564265+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 16646144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 99 ms_handle_reset con 0x559f0af1d400 session 0x559f0aa6cc40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:12.564711+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 16629760 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0af1d800 session 0x559f0977fc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:13.564981+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:14.565158+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fc6e2000/0x0/0x4ffc00000, data 0x89fcaa/0x948000, compress 0x0/0x0/0x0, omap 0x11e47, meta 0x2bbe1b9), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 677227 data_alloc: 218103808 data_used: 19505
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:15.565400+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:16.565713+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0ae9f000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0ae9f000 session 0x559f0b05d500
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566c00 session 0x559f0b05c8c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566400 session 0x559f0b0c9180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0af1d400 session 0x559f0b0d8a80
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0af1d800 session 0x559f0af39500
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:17.565889+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08568400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08568400 session 0x559f09d6bc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566400 session 0x559f0b079dc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74825728 unmapped: 16408576 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:18.566019+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fc6e2000/0x0/0x4ffc00000, data 0x89fcaa/0x948000, compress 0x0/0x0/0x0, omap 0x11f49, meta 0x2bbe0b7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74825728 unmapped: 16408576 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:19.566159+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566c00 session 0x559f0b0eb6c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 678372 data_alloc: 218103808 data_used: 20137
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fc6e2000/0x0/0x4ffc00000, data 0x89fcaa/0x948000, compress 0x0/0x0/0x0, omap 0x11f49, meta 0x2bbe0b7), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74842112 unmapped: 16392192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:20.566370+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74842112 unmapped: 16392192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.915824890s of 10.005084038s, submitted: 54
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:21.566678+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74842112 unmapped: 16392192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:22.566862+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74776576 unmapped: 16457728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 101 ms_handle_reset con 0x559f0b2dbc00 session 0x559f08c38540
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:23.567123+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f0b2db400 session 0x559f090328c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f0b2ea400 session 0x559f0977ea80
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f0b2ea000 session 0x559f0aa6ddc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f08566400 session 0x559f0b05da40
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 16064512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:24.567257+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f08566c00 session 0x559f0b0c96c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 689739 data_alloc: 218103808 data_used: 24268
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 103 heartbeat osd_stat(store_statfs(0x4fc6db000/0x0/0x4ffc00000, data 0x8a2782/0x94f000, compress 0x0/0x0/0x0, omap 0x12951, meta 0x2bbd6af), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75210752 unmapped: 16023552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:25.567424+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2db400 session 0x559f0b083180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75218944 unmapped: 16015360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09cc21c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:26.567607+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2dbc00 session 0x559f0aa6d6c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2ea800 session 0x559f0b0eaa80
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75235328 unmapped: 15998976 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2eac00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2eac00 session 0x559f0af38000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:27.567766+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2eb000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 14901248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:28.567986+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 105 ms_handle_reset con 0x559f0b2eb000 session 0x559f0b082fc0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 14868480 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:29.568177+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 694003 data_alloc: 218103808 data_used: 24268
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 105 ms_handle_reset con 0x559f0af1d400 session 0x559f0b0c9340
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 105 ms_handle_reset con 0x559f0af1d800 session 0x559f0977f340
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 14868480 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 105 heartbeat osd_stat(store_statfs(0x4fc6d2000/0x0/0x4ffc00000, data 0x8a69c9/0x958000, compress 0x0/0x0/0x0, omap 0x1384e, meta 0x2bbc7b2), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:30.568316+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 106 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09cc2000
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76374016 unmapped: 14860288 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:31.568719+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76414976 unmapped: 14819328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:32.568873+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.597695351s of 11.822847366s, submitted: 164
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76414976 unmapped: 14819328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:33.569111+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 107 heartbeat osd_stat(store_statfs(0x4fc6ce000/0x0/0x4ffc00000, data 0x8a948c/0x95c000, compress 0x0/0x0/0x0, omap 0x13f68, meta 0x2bbc098), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f08567c00 session 0x559f09032e00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b078e00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76414976 unmapped: 14819328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea800
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f0b2ea800 session 0x559f09033180
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:34.569228+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 695237 data_alloc: 218103808 data_used: 19252
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76324864 unmapped: 14909440 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:35.569404+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 107 heartbeat osd_stat(store_statfs(0x4fc6f4000/0x0/0x4ffc00000, data 0x885469/0x937000, compress 0x0/0x0/0x0, omap 0x13f68, meta 0x2bbc098), peers [0,1] op hist [0,0,0,1])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f08567c00 session 0x559f090328c0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76324864 unmapped: 14909440 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 108 ms_handle_reset con 0x559f09dcbc00 session 0x559f08c38700
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:36.569549+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:37.569769+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 108 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:38.569937+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 108 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:39.570237+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698135 data_alloc: 218103808 data_used: 19252
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 108 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:40.570449+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:41.570743+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:42.571069+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:43.571436+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.643769264s of 10.903412819s, submitted: 68
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 109 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:44.571742+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 700845 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:45.571923+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 109 heartbeat osd_stat(store_statfs(0x4fc6ed000/0x0/0x4ffc00000, data 0x887f9e/0x93d000, compress 0x0/0x0/0x0, omap 0x14858, meta 0x2bbb7a8), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:46.572079+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:47.572465+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:48.572764+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:49.573145+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:50.573271+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:51.573488+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:52.573621+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:53.573771+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:54.573884+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:55.574020+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:56.574170+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:57.574381+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:58.574521+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:59.574701+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:00.574909+0000)
Dec 03 21:34:37 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1197832626' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:01.575079+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:02.575248+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:03.575409+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:04.575550+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:05.575789+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:06.576050+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:07.576264+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:08.576432+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:09.576634+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:10.576813+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:11.576964+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:12.577144+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:13.577348+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:14.577768+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:15.577943+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:16.578102+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:17.578449+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:18.578621+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:19.578860+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:20.579031+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:21.579186+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:22.579406+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:23.579645+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:24.579797+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:25.579938+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:26.580096+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:27.580338+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:28.580672+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:29.580905+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:30.581123+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:31.581394+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:32.581642+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:33.581846+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:34.582012+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:35.582180+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:36.582370+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:37.582666+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:38.582883+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:39.583099+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:40.583376+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:41.583595+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:42.583808+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:43.584002+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:44.584195+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:45.584384+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:46.584553+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:47.584712+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:48.584875+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:49.585040+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread fragmentation_score=0.000147 took=0.000030s
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:50.585207+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:51.585425+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:52.585618+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:53.585779+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:54.585918+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:55.586079+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:56.586334+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:57.586547+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:58.586740+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:59.586894+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:00.587023+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:01.587135+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:02.587273+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:03.587411+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76480512 unmapped: 14753792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:04.587530+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'config show' '{prefix=config show}'
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:37 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:37 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77086720 unmapped: 14147584 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:05.587678+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76890112 unmapped: 14344192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:06.587828+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:34:37 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:07.588003+0000)
Dec 03 21:34:37 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76947456 unmapped: 14286848 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:37 compute-0 ceph-osd[88129]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:34:38 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14754 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v889: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 03 21:34:38 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3748945380' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:34:38 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14758 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:38 compute-0 ceph-mon[75204]: from='client.14752 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:38 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1197832626' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 03 21:34:38 compute-0 ceph-mon[75204]: from='client.14754 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:38 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3748945380' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:34:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 03 21:34:38 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2192794819' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:34:39 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14762 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 03 21:34:39 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2014216378' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:34:39 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14766 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:39 compute-0 ceph-mon[75204]: pgmap v889: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:39 compute-0 ceph-mon[75204]: from='client.14758 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:39 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2192794819' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:34:39 compute-0 ceph-mon[75204]: from='client.14762 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:39 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2014216378' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:34:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 03 21:34:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3857402893' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:34:40 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14770 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:40 compute-0 podman[250314]: 2025-12-03 21:34:40.134774119 +0000 UTC m=+0.074588065 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 03 21:34:40 compute-0 podman[250317]: 2025-12-03 21:34:40.148367825 +0000 UTC m=+0.099206877 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 21:34:40 compute-0 crontab[250385]: (root) LIST (root)
Dec 03 21:34:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v890: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:40 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14774 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 03 21:34:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3806866220' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:34:40 compute-0 ceph-mon[75204]: from='client.14766 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3857402893' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:34:40 compute-0 ceph-mon[75204]: from='client.14770 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3806866220' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:34:40 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14776 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 03 21:34:41 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/726762375' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 03 21:34:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:41 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14780 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:41 compute-0 ceph-mon[75204]: pgmap v890: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:41 compute-0 ceph-mon[75204]: from='client.14774 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:41 compute-0 ceph-mon[75204]: from='client.14776 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/726762375' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 03 21:34:42 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14784 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:42 compute-0 ceph-mgr[75500]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:34:42 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: 2025-12-03T21:34:42.009+0000 7fa8c63a3640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:34:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v891: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 03 21:34:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1602125538' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 03 21:34:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 03 21:34:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2487170743' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864461899s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974800110s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864461899s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974800110s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864461899s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974800110s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864461899s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974800110s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864461899s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974800110s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.072302 1 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.074917 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.074968 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.074984 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927361488s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.037757874s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927351952s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] exit Reset 0.000017 1 0.000030
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927351952s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927351952s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927351952s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927351952s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.927351952s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.037757874s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.135186 14 0.000091
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.143368 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.143496 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.143526 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864676476s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975151062s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864668846s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975151062s@ mbc={}] exit Reset 0.000018 1 0.000032
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864668846s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975151062s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864668846s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975151062s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864668846s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975151062s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864668846s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975151062s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864668846s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975151062s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.135368 14 0.000081
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.143436 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.143633 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.143708 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864478111s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.975067139s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864466667s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975067139s@ mbc={}] exit Reset 0.000022 1 0.000044
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864466667s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975067139s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864466667s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975067139s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864466667s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975067139s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864466667s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975067139s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864466667s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.975067139s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.070823 1 0.000088
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075054 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075175 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075208 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928797722s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039489746s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928786278s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039489746s@ mbc={}] exit Reset 0.000023 1 0.000045
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928786278s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039489746s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928786278s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039489746s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928786278s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039489746s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928786278s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039489746s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928786278s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039489746s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.135644 14 0.000067
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.143686 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.143995 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.144031 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864211082s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974967957s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864202499s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974967957s@ mbc={}] exit Reset 0.000028 1 0.000029
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864202499s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974967957s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864202499s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974967957s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864202499s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974967957s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864202499s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974967957s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.864202499s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974967957s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071018 1 0.000052
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075241 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075282 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075300 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928587914s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039421082s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928578377s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039421082s@ mbc={}] exit Reset 0.000018 1 0.000032
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928578377s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039421082s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928578377s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039421082s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928578377s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039421082s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928578377s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039421082s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928578377s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039421082s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.135847 14 0.000080
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.144118 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.144218 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.144241 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071180 1 0.000065
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075225 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863828659s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974784851s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075321 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075347 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863812447s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] exit Reset 0.000041 1 0.000055
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863812447s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863812447s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863812447s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863812447s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863812447s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928451538s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039451599s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928428650s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039451599s@ mbc={}] exit Reset 0.000048 1 0.000078
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928428650s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039451599s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.136256 14 0.000065
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928428650s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039451599s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.144335 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928428650s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039451599s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.144440 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928428650s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039451599s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.144540 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928428650s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039451599s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863724709s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974784851s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863713264s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] exit Reset 0.000022 1 0.000037
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863713264s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863713264s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863713264s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863713264s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863713264s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071263 1 0.000116
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075296 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075417 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075459 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.136433 14 0.000425
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.144524 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928549767s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039695740s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.144672 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928540230s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039695740s@ mbc={}] exit Reset 0.000019 1 0.000032
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.144703 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928540230s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039695740s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928540230s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039695740s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928540230s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039695740s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928540230s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039695740s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928540230s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039695740s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863598824s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974769592s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863583565s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974769592s@ mbc={}] exit Reset 0.000045 1 0.000051
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863583565s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974769592s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.136453 14 0.000080
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.144670 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863583565s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974769592s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.144811 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863583565s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974769592s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863583565s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974769592s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.144830 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863583565s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974769592s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863542557s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974784851s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863531113s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] exit Reset 0.000021 1 0.000043
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863531113s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863531113s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863531113s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863531113s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863531113s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974784851s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071248 1 0.000068
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075328 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071301 1 0.000047
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075512 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075322 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075467 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075540 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075493 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928400040s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039718628s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928479195s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039802551s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928391457s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039718628s@ mbc={}] exit Reset 0.000018 1 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928391457s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039718628s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928391457s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039718628s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928391457s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039718628s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928464890s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039802551s@ mbc={}] exit Reset 0.000030 1 0.000047
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928391457s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039718628s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928391457s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039718628s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928464890s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039802551s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928464890s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039802551s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928464890s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039802551s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928464890s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039802551s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928464890s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039802551s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.141038 14 0.000093
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.145119 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.145353 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.145378 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858943939s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.970329285s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858925819s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970329285s@ mbc={}] exit Reset 0.000030 1 0.000045
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858925819s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970329285s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858925819s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970329285s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071366 1 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858925819s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970329285s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075404 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858925819s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970329285s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858925819s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970329285s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075459 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075507 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928402901s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039848328s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928389549s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039848328s@ mbc={}] exit Reset 0.000029 1 0.000046
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.136607 14 0.000144
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.144981 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928389549s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039848328s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.145437 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928389549s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039848328s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928389549s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039848328s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.145464 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928389549s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039848328s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928389549s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039848328s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863302231s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.974792480s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863292694s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974792480s@ mbc={}] exit Reset 0.000023 1 0.000038
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863292694s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974792480s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863292694s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974792480s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863292694s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974792480s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863292694s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974792480s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.863292694s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.974792480s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071456 1 0.000055
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075354 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.141402 14 0.000135
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075469 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.145225 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.145358 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075498 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.145390 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858806610s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.970382690s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928353310s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039932251s@ mbc={}] PeeringState::start_peering_interval up [1] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858794212s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970382690s@ mbc={}] exit Reset 0.000022 1 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858794212s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970382690s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928339958s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039932251s@ mbc={}] exit Reset 0.000030 1 0.000057
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858794212s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970382690s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928339958s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039932251s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858794212s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970382690s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928339958s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039932251s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858794212s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970382690s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928339958s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039932251s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858794212s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970382690s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928339958s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039932251s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928339958s) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039932251s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 7.071529 1 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 7.075485 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.141455 14 0.000157
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 7.075542 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.145358 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.145486 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] exit Started 7.075563 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=39) [1] r=0 lpr=39 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 11.145524 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=35) [1] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928306580s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 active pruub 92.039985657s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858775139s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 95.970458984s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928293228s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039985657s@ mbc={}] exit Reset 0.000028 1 0.000046
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928293228s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039985657s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928293228s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039985657s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928293228s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039985657s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858759880s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970458984s@ mbc={}] exit Reset 0.000030 1 0.000049
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928293228s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039985657s@ mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41 pruub=8.928293228s) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY pruub 92.039985657s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858759880s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970458984s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858759880s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970458984s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858759880s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970458984s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858759880s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970458984s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41 pruub=12.858759880s) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 95.970458984s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/GetLog 0.007640 2 0.000043
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007408 2 0.000038
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetLog 0.007232 2 0.000040
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007049 2 0.000028
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetLog 0.006920 2 0.000021
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006800 2 0.000020
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.006658 2 0.000021
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000050 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000012
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000024
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000034 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000009
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000050 1 0.000025
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000047 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000125 1 0.000130
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000067 1 0.000029
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000035 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000045 1 0.000025
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000029 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000023 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000045 1 0.000042
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000033 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000008
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000036 1 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000027 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000045 1 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000153 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000033 1 0.000049
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000013 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000148 1 0.000080
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000052 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000010
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000062 1 0.000030
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000032 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000008
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000039 1 0.000027
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000032 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000014
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000038 1 0.000021
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000085 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000020
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000084 1 0.000048
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000052 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000013
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000062 1 0.000032
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000086 1 0.000040
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000075 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000087 1 0.000047
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000048 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000013
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000054 1 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000036 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000011
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000053 1 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000069 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000018 1 0.000026
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000080 1 0.000046
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000035 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000010
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000044 1 0.000023
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000033 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000008
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000047 1 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000049 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000008
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000061 1 0.000044
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019423 2 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.019954 2 0.000030
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016319 2 0.000025
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016168 2 0.000021
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.1( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.023069 2 0.000039
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.1( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.1( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.1( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.022473 2 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.022394 2 0.000017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.022277 2 0.000018
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021967 2 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.022125 2 0.000023
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021627 2 0.000082
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021839 2 0.000026
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021460 2 0.000063
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.021168 2 0.000043
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.023124 2 0.000027
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetLog 0.023567 2 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016866 2 0.000025
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016603 2 0.000018
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016502 2 0.000016
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.016403 2 0.000029
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015846 2 0.000085
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015664 2 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.017017 2 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015446 2 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011716 2 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011174 2 0.000026
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000040 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015655 2 0.000018
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000033 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009529 2 0.000027
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009386 2 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008901 2 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000018 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000036 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000084 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010162 2 0.000023
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007575 2 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007427 2 0.000017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009954 2 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000033 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007207 2 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000011 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 41 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 62611456 unmapped: 294912 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:08.549946+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 15 sent 13 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:38.521961+0000 osd.1 (osd.1) 14 : cluster [DBG] 7.1e scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:38.532483+0000 osd.1 (osd.1) 15 : cluster [DBG] 7.1e scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 15)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:38.521961+0000 osd.1 (osd.1) 14 : cluster [DBG] 7.1e scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:38.532483+0000 osd.1 (osd.1) 15 : cluster [DBG] 7.1e scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 41 handle_osd_map epochs [41,42], i have 41, src has [1,42]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 41 handle_osd_map epochs [42,42], i have 42, src has [1,42]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.004047 2 0.000030
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.011584 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.004025 2 0.000039
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering 1.011406 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown m=2 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.004087 2 0.000047
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.011237 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.004011 2 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.003997 2 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.010872 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987564 2 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.010731 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987406 2 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.009125 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987426 2 0.000023
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.008985 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987441 2 0.000029
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.008745 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988400 2 0.000020
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering 1.011046 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.007913 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown m=2 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988003 2 0.000015
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.004247 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.004630 2 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.011370 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987875 2 0.000058
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 1.010095 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987974 2 0.000021
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.010023 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988415 2 0.000021
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.004817 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.982640 2 0.000024
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999315 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.982426 2 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.982785 2 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999751 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999602 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.005723 2 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering 1.013600 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown m=3 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983075 2 0.000030
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999646 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988845 2 0.000032
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983085 2 0.000026
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.010780 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999117 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983348 2 0.000026
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 activating+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999834 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989448 2 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.011811 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983235 2 0.000086
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.992748 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983565 2 0.000042
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999302 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989879 2 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 1.012348 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.990075 2 0.000055
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.012660 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989930 2 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.013146 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989823 2 0.000068
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering 1.013484 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 unknown m=2 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984258 2 0.000115
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.996196 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984441 2 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999954 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984220 2 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.994303 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984806 2 0.000034
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.000568 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984659 2 0.000058
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.995955 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984639 2 0.000129
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984681 2 0.000126
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.994988 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984859 2 0.000154
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.993894 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.991607 2 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.011693 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.994350 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984824 2 0.000095
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.992500 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 42 handle_osd_map epochs [42,42], i have 42, src has [1,42]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984980 2 0.000061
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.992491 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985220 2 0.000151
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.992622 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005264 4 0.000483
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.d( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.007345 4 0.000294
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007641 4 0.000207
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007566 4 0.000084
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.2( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000469 1 0.000077
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007566 4 0.000092
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.14( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007513 4 0.000075
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007447 4 0.000095
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.12( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007328 4 0.000060
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.1b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007279 4 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.11( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.4( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007582 4 0.000141
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.10( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.007296 4 0.000082
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007179 4 0.000140
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007016 4 0.000070
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.17( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.9( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008118 4 0.000187
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.f( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006839 4 0.000047
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006964 4 0.000065
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.15( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.13( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006868 4 0.000136
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/Activating 0.006613 4 0.000194
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006444 4 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.12( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.9( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006564 4 0.000062
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.16( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006502 4 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006273 4 0.000069
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.006038 4 0.000136
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.8( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008786 4 0.000094
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.3( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008404 4 0.000254
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.5( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.008703 4 0.000154
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008368 4 0.000734
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.a( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008363 4 0.000047
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[4.7( empty local-lis/les=41/42 n=0 ec=35/18 lis/c=41/35 les/c/f=42/37/0 sis=41) [1] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008358 4 0.000076
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.5( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008456 4 0.000068
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.008512 4 0.000163
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008239 4 0.000050
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.4( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.7( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008203 4 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008429 4 0.000129
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.6( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.009964 4 0.002068
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008008 4 0.000071
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008148 4 0.000105
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.f( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.c( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008020 4 0.000048
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1d( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007856 4 0.000053
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.19( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008085 4 0.000393
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[2.9( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [1] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.008114 4 0.000045
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.1a( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.007680 4 0.000052
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.016558 7 0.000126
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[5.18( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.019142 7 0.000039
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.026285 7 0.000053
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.025386 7 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.025612 7 0.000073
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.026279 7 0.000104
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030391 7 0.000053
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031150 7 0.000097
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.027714 7 0.000044
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.028240 7 0.000069
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029597 7 0.000038
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029859 7 0.000053
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030076 7 0.000058
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030432 7 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032746 7 0.000040
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030925 7 0.000067
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030855 7 0.000034
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030852 7 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031334 7 0.000038
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031263 7 0.000100
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029704 7 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029801 7 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031794 7 0.000037
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031695 7 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030591 7 0.000049
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032150 7 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032429 7 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032849 7 0.000038
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031720 7 0.000030
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032659 7 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031445 7 0.000034
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031323 7 0.000043
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031751 7 0.000034
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.033107 7 0.000103
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031382 7 0.000065
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031044 7 0.000050
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.030988 7 0.000061
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031533 7 0.000029
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031087 7 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032142 7 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.031458 7 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.068217 2 0.000062
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.068134 2 0.000057
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000010 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 1130496 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.223052 1 0.000114
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.291114 2 0.000028
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.214071 1 0.000126
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000024 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.502137 2 0.000025
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.058969 1 0.000095
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000025 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.560842 2 0.000119
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000020 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.737629 2 0.000026
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.177085 1 0.000085
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.114085 1 0.000105
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000020 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/37 les/c/f=42/38/0 sis=41) [1] r=0 lpr=41 pi=[37,41)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.851553 1 0.000018
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.851675 1 0.000013
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.845792 1 0.000037
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.845848 1 0.000024
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.845995 1 0.000091
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.846167 1 0.000148
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841339 1 0.000050
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841329 1 0.000044
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841456 1 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841700 1 0.000190
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840234 1 0.000067
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840313 1 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840374 1 0.000018
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840484 1 0.000124
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840600 1 0.000054
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840668 1 0.000019
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840741 1 0.000018
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840738 1 0.000023
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840844 1 0.000024
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841225 1 0.000052
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841351 1 0.000018
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841412 1 0.000050
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840687 1 0.000754
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840742 1 0.000017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840834 1 0.000020
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.840906 1 0.000016
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.8( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841008 1 0.000017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.11( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841114 1 0.000051
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841167 1 0.000021
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.5( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841113 1 0.000065
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841159 1 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.2( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841244 1 0.000040
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.7( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841192 1 0.000025
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.15( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841273 1 0.000043
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841316 1 0.000023
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.c( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841489 1 0.000037
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1d( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841544 1 0.000022
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1a( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841609 1 0.000020
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.8( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841621 1 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1e( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841657 1 0.000058
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.5( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.841711 1 0.000053
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.e( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.009797 1 0.000180
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.861409 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.1f( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.877998 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.015301 1 0.000210
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.867016 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.12( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.886187 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.022171 1 0.000072
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.868016 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.18( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.894350 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.029487 1 0.000080
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.875410 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.901725 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.036653 1 0.000054
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.882724 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.16( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.908383 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:09.550172+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.043878 1 0.000091
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.890102 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.11( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.915533 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.051158 1 0.000049
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.892578 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.15( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.923014 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.058525 1 0.000070
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.899917 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.1b( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.927668 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.065727 1 0.000080
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.907229 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.9( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.935513 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.073072 1 0.000046
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.914853 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.17( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.946064 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.080329 1 0.000038
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.920611 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.a( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.950273 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.087647 1 0.000038
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.928009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.3( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.957907 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.094977 1 0.000036
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.935396 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.13( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.968174 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.102270 1 0.000059
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.942825 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.6( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.972987 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.109735 1 0.000039
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.950458 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.981408 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.117104 1 0.000113
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.957750 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.3( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.988208 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.124311 1 0.000035
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.965097 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.6( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.995975 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.131483 1 0.000083
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.972273 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.4( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.003150 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.138972 1 0.000361
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.979871 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[3.c( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.011167 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 42 heartbeat osd_stat(store_statfs(0x4fe0e6000/0x0/0x4ffc00000, data 0x9cc8f/0xe4000, compress 0x0/0x0/0x0, omap 0x742b, meta 0x1a28bd5), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 42 handle_osd_map epochs [43,43], i have 42, src has [1,43]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.148147 1 0.000060
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.989448 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 42 pg[7.9( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.020817 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1b( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.153098 4 0.000064
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1b( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.994508 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1b( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.024237 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.160500 4 0.000088
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.001965 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1f( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.031820 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.f( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.167859 4 0.000056
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.f( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.008616 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.f( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.041148 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.175123 4 0.000044
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.015911 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.047631 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.182393 4 0.000087
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.023275 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.18( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.053903 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.189882 4 0.000063
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.030856 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.8( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.063037 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.11( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.197042 4 0.000077
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.11( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.038092 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.11( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.070965 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.204241 4 0.000051
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.045411 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.077877 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.211531 4 0.000076
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.052776 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.5( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.084526 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.e( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.218829 4 0.000051
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.e( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.059993 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.e( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.092704 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.226098 4 0.000077
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.067301 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.2( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.098779 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.7( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.233665 4 0.000067
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.7( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.074972 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.7( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.106327 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.241100 4 0.000078
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.082347 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.15( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.115483 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.248354 4 0.000043
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.089689 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.121475 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000170 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000022 1 0.000046
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000018 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000230 1 0.000121
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000120 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000026
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000088 1 0.000045
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000054 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000067 1 0.000044
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000086 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=0 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000135 1 0.000062
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.001474 2 0.000041
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.001325 2 0.000353
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000010 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000647 2 0.000064
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.002226 2 0.000075
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.257454 4 0.000083
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.098844 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.c( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.130255 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1d( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.264106 4 0.000051
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1d( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.105652 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1d( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.136722 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.269901 4 0.000081
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.111487 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.1a( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.142503 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.8( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.277365 4 0.000042
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.8( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.119015 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.8( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.150568 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1e( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.284566 4 0.000042
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1e( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.126233 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.1e( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.157369 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.5( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.291974 4 0.000050
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.5( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.133686 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[3.5( empty lb MIN local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.165877 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.299386 4 0.000031
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 1.141150 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 43 pg[7.e( empty lb MIN local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [2] r=-1 lpr=41 pi=[39,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 2.172649 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 361585 data_alloc: 218103808 data_used: 0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63021056 unmapped: 1982464 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:10.550304+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 43 handle_osd_map epochs [43,44], i have 43, src has [1,44]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.923857 2 0.000167
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.924093 2 0.000079
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 0.925555 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.924129 2 0.000046
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.924965 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 0.926398 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.924929 2 0.000078
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.926561 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 44 handle_osd_map epochs [44,44], i have 44, src has [1,44]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002694 4 0.000117
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.002750 4 0.000156
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000024 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000333 1 0.000195
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.005096 5 0.000486
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004689 4 0.000126
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000017 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.008079 2 0.000090
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000017 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=43/44 n=2 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.006019 1 0.000072
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000011 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 lc 31'19 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.066848 1 0.000091
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000018 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/37 les/c/f=44/38/0 sis=43) [1] r=0 lpr=43 pi=[37,43)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63127552 unmapped: 1875968 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:11.550426+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 1 last_log 16 sent 15 num 1 unsent 1 sending 1
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:41.548134+0000 osd.1 (osd.1) 16 : cluster [DBG] 3.1a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 16)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:41.548134+0000 osd.1 (osd.1) 16 : cluster [DBG] 3.1a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63086592 unmapped: 1916928 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 44 heartbeat osd_stat(store_statfs(0x4fe0dc000/0x0/0x4ffc00000, data 0x9f725/0xea000, compress 0x0/0x0/0x0, omap 0x7aef, meta 0x1a28511), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:12.550625+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 1 last_log 17 sent 16 num 1 unsent 1 sending 1
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:41.558670+0000 osd.1 (osd.1) 17 : cluster [DBG] 3.1a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 17)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:41.558670+0000 osd.1 (osd.1) 17 : cluster [DBG] 3.1a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63102976 unmapped: 1900544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 44 heartbeat osd_stat(store_statfs(0x4fe0e2000/0x0/0x4ffc00000, data 0x9f725/0xea000, compress 0x0/0x0/0x0, omap 0x7aef, meta 0x1a28511), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:13.550817+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 44 heartbeat osd_stat(store_statfs(0x4fe0e2000/0x0/0x4ffc00000, data 0x9f725/0xea000, compress 0x0/0x0/0x0, omap 0x7aef, meta 0x1a28511), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63102976 unmapped: 1900544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.783351898s of 10.039956093s, submitted: 421
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:14.550994+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 3 last_log 20 sent 17 num 3 unsent 3 sending 3
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:43.587463+0000 osd.1 (osd.1) 18 : cluster [DBG] 7.1d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:43.597996+0000 osd.1 (osd.1) 19 : cluster [DBG] 7.1d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:44.544613+0000 osd.1 (osd.1) 20 : cluster [DBG] 3.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 44 heartbeat osd_stat(store_statfs(0x4fe0e2000/0x0/0x4ffc00000, data 0x9f725/0xea000, compress 0x0/0x0/0x0, omap 0x7aef, meta 0x1a28511), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 20)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:43.587463+0000 osd.1 (osd.1) 18 : cluster [DBG] 7.1d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:43.597996+0000 osd.1 (osd.1) 19 : cluster [DBG] 7.1d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:44.544613+0000 osd.1 (osd.1) 20 : cluster [DBG] 3.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 372816 data_alloc: 218103808 data_used: 0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63102976 unmapped: 1900544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:15.551255+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 1 last_log 21 sent 20 num 1 unsent 1 sending 1
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:44.555228+0000 osd.1 (osd.1) 21 : cluster [DBG] 3.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 21)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:44.555228+0000 osd.1 (osd.1) 21 : cluster [DBG] 3.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63102976 unmapped: 1900544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:16.551522+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63102976 unmapped: 1900544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:17.551700+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 23 sent 21 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:47.495256+0000 osd.1 (osd.1) 22 : cluster [DBG] 7.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:47.505792+0000 osd.1 (osd.1) 23 : cluster [DBG] 7.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 44 handle_osd_map epochs [44,45], i have 44, src has [1,45]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 8.811632 8 0.000173
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 9.110325 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 10.120440 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 10.120484 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.897034645s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.124214172s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 8.538527 8 0.000156
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 9.108539 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896965027s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124214172s@ mbc={}] exit Reset 0.000127 1 0.000214
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 10.120953 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896965027s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124214172s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896965027s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124214172s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896965027s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124214172s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 10.120984 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896965027s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124214172s@ mbc={}] exit Start 0.000021 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896965027s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124214172s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900068283s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.127418518s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900019646s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127418518s@ mbc={}] exit Reset 0.000094 1 0.000194
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900019646s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127418518s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900019646s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127418518s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900019646s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127418518s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900019646s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127418518s@ mbc={}] exit Start 0.000019 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 8.247598 8 0.000177
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900019646s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127418518s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 9.110921 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 10.122479 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 10.122501 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900277138s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.127799988s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900236130s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127799988s@ mbc={}] exit Reset 0.000072 1 0.000107
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900236130s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127799988s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900236130s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127799988s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900236130s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127799988s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900236130s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127799988s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.900236130s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.127799988s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 8.598043 8 0.000157
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 9.110061 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 10.123687 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 10.123725 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896610260s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 active pruub 108.124343872s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896554947s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124343872s@ mbc={}] exit Reset 0.000096 1 0.000148
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896554947s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124343872s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896554947s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124343872s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896554947s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124343872s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896554947s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124343872s@ mbc={}] exit Start 0.000020 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 45 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=14.896554947s) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY pruub 108.124343872s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 23)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:47.495256+0000 osd.1 (osd.1) 22 : cluster [DBG] 7.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:47.505792+0000 osd.1 (osd.1) 23 : cluster [DBG] 7.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 45 handle_osd_map epochs [45,45], i have 45, src has [1,45]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63111168 unmapped: 1892352 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:18.551959+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 45 handle_osd_map epochs [45,46], i have 45, src has [1,46]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 1.024282 7 0.000122
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 1.024614 7 0.000064
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 1.024911 7 0.000110
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 1.024826 7 0.000104
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 46 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.011369 2 0.000086
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.011413 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000143 1 0.000080
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 DELETING pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.123994 2 0.000241
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.124251 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 1.160684 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63119360 unmapped: 1884160 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.205433 2 0.000047
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.205474 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000132 1 0.000092
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.264005 2 0.000029
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.264038 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000096 1 0.000063
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 DELETING pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.137502 2 0.000212
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.137695 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 1.367532 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 DELETING pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.153458 2 0.000223
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.153638 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 1.442571 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.419752 2 0.000089
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.419790 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000099 1 0.000102
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 DELETING pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.019383 2 0.000133
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.019524 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=-1 lpr=45 pi=[41,45)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 1.464016 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 46 heartbeat osd_stat(store_statfs(0x4fe0dd000/0x0/0x4ffc00000, data 0xa0d3b/0xed000, compress 0x0/0x0/0x0, omap 0x7d8d, meta 0x1a28273), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:19.552114+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 46 handle_osd_map epochs [46,47], i have 46, src has [1,47]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 368741 data_alloc: 218103808 data_used: 0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63168512 unmapped: 1835008 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:20.552299+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 25 sent 23 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:50.470528+0000 osd.1 (osd.1) 24 : cluster [DBG] 3.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:50.481054+0000 osd.1 (osd.1) 25 : cluster [DBG] 3.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 25)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:50.470528+0000 osd.1 (osd.1) 24 : cluster [DBG] 3.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:50.481054+0000 osd.1 (osd.1) 25 : cluster [DBG] 3.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63234048 unmapped: 1769472 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:21.552499+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 47 handle_osd_map epochs [47,48], i have 47, src has [1,48]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 12.410961 17 0.000433
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 13.157714 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 14.171227 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 14.171329 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 13.084869 17 0.000147
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 13.161128 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 14.172559 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 14.172595 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850582123s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 active pruub 108.127700806s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846266747s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 active pruub 108.123405457s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850494385s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.127700806s@ mbc={}] exit Reset 0.000157 1 0.000365
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850494385s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.127700806s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850494385s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.127700806s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846174240s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.123405457s@ mbc={}] exit Reset 0.000136 1 0.000189
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850494385s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.127700806s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846174240s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.123405457s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850494385s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.127700806s@ mbc={}] exit Start 0.000018 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846174240s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.123405457s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846174240s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.123405457s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.850494385s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.127700806s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846174240s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.123405457s@ mbc={}] exit Start 0.000015 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48 pruub=10.846174240s) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY pruub 108.123405457s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 48 handle_osd_map epochs [47,48], i have 48, src has [1,48]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=0 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000104 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=0 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000020 1 0.000039
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000142 1 0.000061
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=0 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000078 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=0 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000030
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000108 1 0.000051
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.001060 2 0.000061
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=4 mbc={}] exit Started/Primary/Peering/GetLog 0.000945 2 0.000061
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=4 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=4 mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 48 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=4 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 63234048 unmapped: 1769472 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:22.552670+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 48 handle_osd_map epochs [48,49], i have 48, src has [1,49]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 48 handle_osd_map epochs [49,49], i have 49, src has [1,49]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.007568 2 0.000073
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 1.008849 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 1.017124 6 0.000095
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 1.017222 6 0.000126
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=4 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.007055 2 0.000107
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 peering m=4 mbc={}] exit Started/Primary/Peering 1.008180 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 0'0 unknown m=4 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=4 mbc={255={(0+1)=4}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.005150 4 0.000241
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(0+1)=4}}] exit Started/Primary/Active/Activating 0.005203 4 0.000919
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(0+1)=4}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000653 1 0.000329
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000011 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 lc 31'17 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 49 heartbeat osd_stat(store_statfs(0x4fe0d7000/0x0/0x4ffc00000, data 0xa4b5f/0xf3000, compress 0x0/0x0/0x0, omap 0x87b7, meta 0x1a27849), peers [0,2] op hist [0,0,0,0,0,1])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.070162 2 0.000078
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000023 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=47/49 n=1 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=4 mbc={255={(0+1)=4}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.070683 2 0.000130
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=4 mbc={255={(0+1)=4}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=4 mbc={255={(0+1)=4}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000021 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 lc 31'15 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=4 mbc={255={(0+1)=4}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.081930 3 0.000083
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.082007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 64299008 unmapped: 704512 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.222372 3 0.000114
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.222425 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.277125 1 0.000132
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=47/49 n=2 ec=37/21 lis/c=47/37 les/c/f=49/38/0 sis=47) [1] r=0 lpr=48 pi=[37,47)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.271493 1 0.000155
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.131538 1 0.000127
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 DELETING pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.029365 2 0.000273
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.161008 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 1.400818 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 DELETING pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.044731 2 0.000684
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.316364 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=-1 lpr=48 pi=[41,48)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 1.415595 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 49 heartbeat osd_stat(store_statfs(0x4fe0d7000/0x0/0x4ffc00000, data 0xa4b5f/0xf3000, compress 0x0/0x0/0x0, omap 0x87b7, meta 0x1a27849), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:23.552805+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 27 sent 25 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:53.501491+0000 osd.1 (osd.1) 26 : cluster [DBG] 7.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:53.512053+0000 osd.1 (osd.1) 27 : cluster [DBG] 7.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 27)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:53.501491+0000 osd.1 (osd.1) 26 : cluster [DBG] 7.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:53.512053+0000 osd.1 (osd.1) 27 : cluster [DBG] 7.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 64389120 unmapped: 614400 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:24.553026+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 379561 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 64389120 unmapped: 614400 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.780271530s of 10.901998520s, submitted: 64
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:25.553199+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 29 sent 27 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:55.446606+0000 osd.1 (osd.1) 28 : cluster [DBG] 3.13 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:10:55.456906+0000 osd.1 (osd.1) 29 : cluster [DBG] 3.13 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 29)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:55.446606+0000 osd.1 (osd.1) 28 : cluster [DBG] 3.13 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:10:55.456906+0000 osd.1 (osd.1) 29 : cluster [DBG] 3.13 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:26.553419+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:27.553624+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:28.553821+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 581632 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 49 handle_osd_map epochs [50,51], i have 49, src has [1,51]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 51 heartbeat osd_stat(store_statfs(0x4fe0d5000/0x0/0x4ffc00000, data 0xa659d/0xf7000, compress 0x0/0x0/0x0, omap 0x8ce3, meta 0x1a2731d), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 51 handle_osd_map epochs [52,52], i have 52, src has [1,52]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:29.553961+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 391474 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65527808 unmapped: 524288 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.17 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.17 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:30.554120+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 31 sent 29 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:00.433539+0000 osd.1 (osd.1) 30 : cluster [DBG] 7.17 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:00.444425+0000 osd.1 (osd.1) 31 : cluster [DBG] 7.17 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 31)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:00.433539+0000 osd.1 (osd.1) 30 : cluster [DBG] 7.17 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:00.444425+0000 osd.1 (osd.1) 31 : cluster [DBG] 7.17 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65552384 unmapped: 499712 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:31.554284+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 52 heartbeat osd_stat(store_statfs(0x4fe0c8000/0x0/0x4ffc00000, data 0xaa7df/0x100000, compress 0x0/0x0/0x0, omap 0x922d, meta 0x1a26dd3), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 52 handle_osd_map epochs [53,53], i have 53, src has [1,53]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 22.910079 31 0.000113
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 22.916168 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 23.927995 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 23.928020 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=41) [1] r=0 lpr=41 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.090121269s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 active pruub 116.124732971s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.089967728s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 116.124732971s@ mbc={}] exit Reset 0.000245 1 0.000320
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.089967728s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 116.124732971s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.089967728s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 116.124732971s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.089967728s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 116.124732971s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.089967728s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 116.124732971s@ mbc={}] exit Start 0.000138 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 53 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53 pruub=9.089967728s) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 116.124732971s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 53 handle_osd_map epochs [53,54], i have 53, src has [1,54]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=43) [1] r=0 lpr=43 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 21.257866 28 0.000187
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=43) [1] r=0 lpr=43 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 21.260673 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=43) [1] r=0 lpr=43 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 22.185658 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=43) [1] r=0 lpr=43 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 22.185691 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=43) [1] r=0 lpr=43 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742419243s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 active pruub 118.168533325s@ mbc={}] PeeringState::start_peering_interval up [1] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 0, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742380142s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.168533325s@ mbc={}] exit Reset 0.000070 1 0.000120
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742380142s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.168533325s@ mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742380142s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.168533325s@ mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742380142s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.168533325s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742380142s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.168533325s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54 pruub=10.742380142s) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.168533325s@ mbc={}] enter Started/Stray
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 54 handle_osd_map epochs [54,54], i have 54, src has [1,54]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 0.398361 7 0.000300
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000064 1 0.000092
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 DELETING pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.002234 1 0.000044
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.002369 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] lb MIN local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=-1 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started 0.400984 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65634304 unmapped: 417792 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:32.554432+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 327680 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 54 handle_osd_map epochs [55,55], i have 54, src has [1,55]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.442408 6 0.000095
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000096 1 0.000046
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] lb MIN local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 DELETING pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.002435 2 0.000033
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] lb MIN local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.002569 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] lb MIN local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=-1 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started 1.445023 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:33.554585+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 270336 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=0 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000108 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=0 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000018 1 0.000040
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000177 1 0.000054
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.001527 2 0.000049
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000041 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 56 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:34.554753+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 33 sent 31 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:04.440886+0000 osd.1 (osd.1) 32 : cluster [DBG] 7.16 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:04.451729+0000 osd.1 (osd.1) 33 : cluster [DBG] 7.16 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 405546 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 56 handle_osd_map epochs [56,57], i have 56, src has [1,57]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.523468 2 0.000303
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 0.525418 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 33)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:04.440886+0000 osd.1 (osd.1) 32 : cluster [DBG] 7.16 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:04.451729+0000 osd.1 (osd.1) 33 : cluster [DBG] 7.16 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.003178 3 0.000738
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000193 1 0.000095
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000054 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.008014 3 0.000251
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000045 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=56/57 n=1 ec=37/21 lis/c=56/45 les/c/f=57/46/0 sis=56) [1] r=0 lpr=56 pi=[45,56)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65806336 unmapped: 245760 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:35.555006+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 212992 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 57 heartbeat osd_stat(store_statfs(0x4fe0b7000/0x0/0x4ffc00000, data 0xb1331/0x10f000, compress 0x0/0x0/0x0, omap 0x9f2d, meta 0x1a260d3), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 57 handle_osd_map epochs [58,58], i have 57, src has [1,58]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.751835823s of 10.814554214s, submitted: 28
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:36.555195+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 35 sent 33 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:06.435238+0000 osd.1 (osd.1) 34 : cluster [DBG] 3.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:06.445746+0000 osd.1 (osd.1) 35 : cluster [DBG] 3.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 58 heartbeat osd_stat(store_statfs(0x4fe0b2000/0x0/0x4ffc00000, data 0xb2947/0x112000, compress 0x0/0x0/0x0, omap 0xa1a0, meta 0x1a25e60), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 35)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:06.435238+0000 osd.1 (osd.1) 34 : cluster [DBG] 3.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:06.445746+0000 osd.1 (osd.1) 35 : cluster [DBG] 3.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 393216 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:37.555452+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 58 heartbeat osd_stat(store_statfs(0x4fe0b8000/0x0/0x4ffc00000, data 0xb2947/0x112000, compress 0x0/0x0/0x0, omap 0xa1a0, meta 0x1a25e60), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 393216 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:38.555669+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 37 sent 35 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:08.394140+0000 osd.1 (osd.1) 36 : cluster [DBG] 7.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:08.404527+0000 osd.1 (osd.1) 37 : cluster [DBG] 7.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65667072 unmapped: 385024 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 37)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:08.394140+0000 osd.1 (osd.1) 36 : cluster [DBG] 7.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:08.404527+0000 osd.1 (osd.1) 37 : cluster [DBG] 7.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 58 handle_osd_map epochs [59,59], i have 58, src has [1,59]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d(unlocked)] enter Initial
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=0 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000200 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=0 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000041 1 0.000090
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000343 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000334 1 0.000568
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetLog 0.001448 2 0.000134
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetMissing 0.000023 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 59 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:39.555884+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 39 sent 37 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:09.368822+0000 osd.1 (osd.1) 38 : cluster [DBG] 7.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:09.379373+0000 osd.1 (osd.1) 39 : cluster [DBG] 7.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 423563 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65691648 unmapped: 1409024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 39)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:09.368822+0000 osd.1 (osd.1) 38 : cluster [DBG] 7.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:09.379373+0000 osd.1 (osd.1) 39 : cluster [DBG] 7.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 59 handle_osd_map epochs [59,60], i have 59, src has [1,60]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 59 handle_osd_map epochs [59,60], i have 60, src has [1,60]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/WaitUpThru 0.903401 2 0.000196
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering 0.905379 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 0'0 unknown m=2 mbc={}] enter Started/Primary/Active
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.002931 3 0.000214
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000102 1 0.000052
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000036 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.068030 3 0.000195
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000023 0 0.000000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=59/60 n=1 ec=37/21 lis/c=59/48 les/c/f=60/49/0 sis=59) [1] r=0 lpr=59 pi=[48,59)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 60 heartbeat osd_stat(store_statfs(0x4fe0b5000/0x0/0x4ffc00000, data 0xb3f5d/0x115000, compress 0x0/0x0/0x0, omap 0xa417, meta 0x1a25be9), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:40.556050+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 41 sent 39 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:10.340034+0000 osd.1 (osd.1) 40 : cluster [DBG] 3.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:10.350518+0000 osd.1 (osd.1) 41 : cluster [DBG] 3.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65716224 unmapped: 1384448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 41)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:10.340034+0000 osd.1 (osd.1) 40 : cluster [DBG] 3.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:10.350518+0000 osd.1 (osd.1) 41 : cluster [DBG] 3.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:41.556225+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65716224 unmapped: 1384448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 60 handle_osd_map epochs [60,61], i have 60, src has [1,61]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:42.556401+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 43 sent 41 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:12.345363+0000 osd.1 (osd.1) 42 : cluster [DBG] 3.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:12.355874+0000 osd.1 (osd.1) 43 : cluster [DBG] 3.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 1376256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 43)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:12.345363+0000 osd.1 (osd.1) 42 : cluster [DBG] 3.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:12.355874+0000 osd.1 (osd.1) 43 : cluster [DBG] 3.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:43.557203+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 61 heartbeat osd_stat(store_statfs(0x4fe0ae000/0x0/0x4ffc00000, data 0xb6bd7/0x11c000, compress 0x0/0x0/0x0, omap 0xaaa5, meta 0x1a2555b), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65732608 unmapped: 1368064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:44.557354+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 45 sent 43 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:14.336332+0000 osd.1 (osd.1) 44 : cluster [DBG] 3.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:14.346871+0000 osd.1 (osd.1) 45 : cluster [DBG] 3.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 438758 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 1351680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:45.557588+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 45)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:14.336332+0000 osd.1 (osd.1) 44 : cluster [DBG] 3.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:14.346871+0000 osd.1 (osd.1) 45 : cluster [DBG] 3.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 1351680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:46.557738+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 1351680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:47.557863+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ab000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:48.558048+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ab000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ab000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:49.558226+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 442250 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65765376 unmapped: 1335296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ab000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:50.558422+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65765376 unmapped: 1335296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:51.558558+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65765376 unmapped: 1335296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ab000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:52.558753+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ab000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 1318912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:53.558892+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 1318912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 17.895381927s of 17.993841171s, submitted: 24
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ab000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:54.559030+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 47 sent 45 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:24.255130+0000 osd.1 (osd.1) 46 : cluster [DBG] 3.0 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:24.265696+0000 osd.1 (osd.1) 47 : cluster [DBG] 3.0 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 443941 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65798144 unmapped: 1302528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 47)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:24.255130+0000 osd.1 (osd.1) 46 : cluster [DBG] 3.0 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:24.265696+0000 osd.1 (osd.1) 47 : cluster [DBG] 3.0 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:55.559292+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65806336 unmapped: 1294336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:56.559453+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65806336 unmapped: 1294336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:57.559592+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65814528 unmapped: 1286144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:58.559742+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 49 sent 47 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:28.269816+0000 osd.1 (osd.1) 48 : cluster [DBG] 7.0 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:28.280174+0000 osd.1 (osd.1) 49 : cluster [DBG] 7.0 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65814528 unmapped: 1286144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 49)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:28.269816+0000 osd.1 (osd.1) 48 : cluster [DBG] 7.0 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:28.280174+0000 osd.1 (osd.1) 49 : cluster [DBG] 7.0 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:59.559917+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 446352 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65822720 unmapped: 1277952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:00.560122+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:01.560328+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:02.560492+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:03.560667+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.989287376s of 10.003664970s, submitted: 5
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:04.560874+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 51 sent 49 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:34.248037+0000 osd.1 (osd.1) 50 : cluster [DBG] 3.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:34.258804+0000 osd.1 (osd.1) 51 : cluster [DBG] 3.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 448763 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 51)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:34.248037+0000 osd.1 (osd.1) 50 : cluster [DBG] 3.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:34.258804+0000 osd.1 (osd.1) 51 : cluster [DBG] 3.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:05.561179+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:06.561366+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:36.255327+0000 osd.1 (osd.1) 52 : cluster [DBG] 7.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:36.265818+0000 osd.1 (osd.1) 53 : cluster [DBG] 7.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65888256 unmapped: 1212416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 53)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:36.255327+0000 osd.1 (osd.1) 52 : cluster [DBG] 7.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:36.265818+0000 osd.1 (osd.1) 53 : cluster [DBG] 7.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:07.561584+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65888256 unmapped: 1212416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:08.561800+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65888256 unmapped: 1212416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:09.561978+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 451174 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65896448 unmapped: 1204224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:10.562152+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65896448 unmapped: 1204224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:11.562383+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65896448 unmapped: 1204224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:12.562527+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65904640 unmapped: 1196032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:13.562667+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65904640 unmapped: 1196032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:14.562820+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 451174 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.963445663s of 10.970324516s, submitted: 3
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:15.562979+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:45.229158+0000 osd.1 (osd.1) 54 : cluster [DBG] 7.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:45.239709+0000 osd.1 (osd.1) 55 : cluster [DBG] 7.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65904640 unmapped: 1196032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 55)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:45.229158+0000 osd.1 (osd.1) 54 : cluster [DBG] 7.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:45.239709+0000 osd.1 (osd.1) 55 : cluster [DBG] 7.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:16.563162+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:17.563334+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:18.563539+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:48.193274+0000 osd.1 (osd.1) 56 : cluster [DBG] 3.1c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:48.203823+0000 osd.1 (osd.1) 57 : cluster [DBG] 3.1c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 57)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:48.193274+0000 osd.1 (osd.1) 56 : cluster [DBG] 3.1c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:48.203823+0000 osd.1 (osd.1) 57 : cluster [DBG] 3.1c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:19.563812+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 455998 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:20.563960+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:21.564149+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:22.564328+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:52.224982+0000 osd.1 (osd.1) 58 : cluster [DBG] 7.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:52.235523+0000 osd.1 (osd.1) 59 : cluster [DBG] 7.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 65978368 unmapped: 1122304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 59)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:52.224982+0000 osd.1 (osd.1) 58 : cluster [DBG] 7.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:52.235523+0000 osd.1 (osd.1) 59 : cluster [DBG] 7.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:23.564540+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:53.258275+0000 osd.1 (osd.1) 60 : cluster [DBG] 4.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:53.268853+0000 osd.1 (osd.1) 61 : cluster [DBG] 4.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66002944 unmapped: 1097728 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 61)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:53.258275+0000 osd.1 (osd.1) 60 : cluster [DBG] 4.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:53.268853+0000 osd.1 (osd.1) 61 : cluster [DBG] 4.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:24.564811+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 460822 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:25.564953+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:26.565149+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.175627708s of 12.195529938s, submitted: 8
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:27.565318+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:57.424710+0000 osd.1 (osd.1) 62 : cluster [DBG] 4.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:11:57.435116+0000 osd.1 (osd.1) 63 : cluster [DBG] 4.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 63)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:57.424710+0000 osd.1 (osd.1) 62 : cluster [DBG] 4.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:11:57.435116+0000 osd.1 (osd.1) 63 : cluster [DBG] 4.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:28.565543+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:29.565649+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 463233 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66027520 unmapped: 1073152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:30.565795+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66027520 unmapped: 1073152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:31.565925+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66035712 unmapped: 1064960 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:32.566065+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:02.513830+0000 osd.1 (osd.1) 64 : cluster [DBG] 4.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:02.524407+0000 osd.1 (osd.1) 65 : cluster [DBG] 4.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66035712 unmapped: 1064960 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 65)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:02.513830+0000 osd.1 (osd.1) 64 : cluster [DBG] 4.14 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:02.524407+0000 osd.1 (osd.1) 65 : cluster [DBG] 4.14 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:33.566291+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:03.488760+0000 osd.1 (osd.1) 66 : cluster [DBG] 4.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:03.499242+0000 osd.1 (osd.1) 67 : cluster [DBG] 4.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66035712 unmapped: 1064960 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 67)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:03.488760+0000 osd.1 (osd.1) 66 : cluster [DBG] 4.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:03.499242+0000 osd.1 (osd.1) 67 : cluster [DBG] 4.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:34.566467+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 468059 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:35.566644+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:36.566789+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66052096 unmapped: 1048576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:37.566907+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66052096 unmapped: 1048576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:38.567129+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66060288 unmapped: 1040384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:39.567336+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 468059 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66068480 unmapped: 1032192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:40.567482+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66076672 unmapped: 1024000 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.061655045s of 14.075113297s, submitted: 6
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:41.567722+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:11.499858+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.1b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:11.510427+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.1b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 69)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:11.499858+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.1b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:11.510427+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.1b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:42.567989+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:43.568117+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 1 last_log 70 sent 69 num 1 unsent 1 sending 1
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:13.562030+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.11 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 70)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:13.562030+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.11 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 1007616 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:44.568301+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 1 last_log 71 sent 70 num 1 unsent 1 sending 1
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:13.572546+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.11 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 71)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:13.572546+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.11 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 472885 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66101248 unmapped: 999424 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:45.568522+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:15.554886+0000 osd.1 (osd.1) 72 : cluster [DBG] 4.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:15.565426+0000 osd.1 (osd.1) 73 : cluster [DBG] 4.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 73)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:15.554886+0000 osd.1 (osd.1) 72 : cluster [DBG] 4.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:15.565426+0000 osd.1 (osd.1) 73 : cluster [DBG] 4.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66101248 unmapped: 999424 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:46.568730+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:16.552950+0000 osd.1 (osd.1) 74 : cluster [DBG] 4.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:16.563498+0000 osd.1 (osd.1) 75 : cluster [DBG] 4.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 75)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:16.552950+0000 osd.1 (osd.1) 74 : cluster [DBG] 4.10 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:16.563498+0000 osd.1 (osd.1) 75 : cluster [DBG] 4.10 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 991232 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:47.568897+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 991232 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:48.569047+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:49.569205+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 477709 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:50.569342+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:51.569495+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:52.569610+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.014001846s of 12.029939651s, submitted: 8
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:53.569751+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:23.529817+0000 osd.1 (osd.1) 76 : cluster [DBG] 2.17 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:23.540346+0000 osd.1 (osd.1) 77 : cluster [DBG] 2.17 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 77)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:23.529817+0000 osd.1 (osd.1) 76 : cluster [DBG] 2.17 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:23.540346+0000 osd.1 (osd.1) 77 : cluster [DBG] 2.17 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:54.569962+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 480122 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:55.570081+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:56.570241+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:26.538444+0000 osd.1 (osd.1) 78 : cluster [DBG] 4.f scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:26.548858+0000 osd.1 (osd.1) 79 : cluster [DBG] 4.f scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66174976 unmapped: 925696 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 79)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:26.538444+0000 osd.1 (osd.1) 78 : cluster [DBG] 4.f scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:26.548858+0000 osd.1 (osd.1) 79 : cluster [DBG] 4.f scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:57.570418+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66183168 unmapped: 917504 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:58.570650+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66183168 unmapped: 917504 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:59.570760+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:29.547396+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.15 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:29.557980+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.15 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 484946 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 81)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:29.547396+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.15 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:29.557980+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.15 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:00.570919+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:01.571089+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66199552 unmapped: 901120 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:02.571261+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:32.532141+0000 osd.1 (osd.1) 82 : cluster [DBG] 5.13 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:32.542720+0000 osd.1 (osd.1) 83 : cluster [DBG] 5.13 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 83)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:32.532141+0000 osd.1 (osd.1) 82 : cluster [DBG] 5.13 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:32.542720+0000 osd.1 (osd.1) 83 : cluster [DBG] 5.13 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:03.571557+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:04.571759+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 487359 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.013473511s of 12.029477119s, submitted: 8
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:05.571907+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:35.559264+0000 osd.1 (osd.1) 84 : cluster [DBG] 5.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:35.569835+0000 osd.1 (osd.1) 85 : cluster [DBG] 5.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 85)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:35.559264+0000 osd.1 (osd.1) 84 : cluster [DBG] 5.12 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:35.569835+0000 osd.1 (osd.1) 85 : cluster [DBG] 5.12 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:06.572110+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:07.572335+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66232320 unmapped: 868352 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:08.572535+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:09.572702+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:39.545417+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:39.555959+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 492183 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 851968 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 87)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:39.545417+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:39.555959+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:10.572910+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:40.535970+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:40.546521+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66256896 unmapped: 843776 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 89)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:40.535970+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:40.546521+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:11.573129+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66265088 unmapped: 835584 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:12.573291+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66265088 unmapped: 835584 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:13.573436+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:43.429237+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:43.439861+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 91)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:43.429237+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:43.439861+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:14.573686+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 497007 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:15.573866+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:16.573980+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.861497879s of 11.878818512s, submitted: 8
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:17.574161+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:47.438326+0000 osd.1 (osd.1) 92 : cluster [DBG] 4.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:47.448771+0000 osd.1 (osd.1) 93 : cluster [DBG] 4.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 93)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:47.438326+0000 osd.1 (osd.1) 92 : cluster [DBG] 4.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:47.448771+0000 osd.1 (osd.1) 93 : cluster [DBG] 4.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:18.574396+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:19.574606+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:49.419861+0000 osd.1 (osd.1) 94 : cluster [DBG] 2.3 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:49.430311+0000 osd.1 (osd.1) 95 : cluster [DBG] 2.3 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 501829 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 95)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:49.419861+0000 osd.1 (osd.1) 94 : cluster [DBG] 2.3 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:49.430311+0000 osd.1 (osd.1) 95 : cluster [DBG] 2.3 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:20.574830+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:21.574983+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:22.575201+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:23.575341+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:24.575508+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 501829 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66338816 unmapped: 761856 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:25.575693+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:55.425956+0000 osd.1 (osd.1) 96 : cluster [DBG] 4.5 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:55.436538+0000 osd.1 (osd.1) 97 : cluster [DBG] 4.5 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 97)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:55.425956+0000 osd.1 (osd.1) 96 : cluster [DBG] 4.5 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:55.436538+0000 osd.1 (osd.1) 97 : cluster [DBG] 4.5 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66347008 unmapped: 753664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:26.575965+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66347008 unmapped: 753664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:27.576174+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66347008 unmapped: 753664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:28.576452+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.959852219s of 11.973832130s, submitted: 6
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:29.576670+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:59.412221+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:59.422160+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 99)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:59.412221+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:59.422160+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 506651 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:30.576966+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:31.577112+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:32.577255+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:33.577385+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66404352 unmapped: 696320 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:34.577506+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 506651 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:35.577692+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:36.577892+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:37.578176+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:38.578500+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:39.578784+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 506651 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:40.578977+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.087844849s of 12.090794563s, submitted: 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:41.579234+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:11.503004+0000 osd.1 (osd.1) 100 : cluster [DBG] 4.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:11.513501+0000 osd.1 (osd.1) 101 : cluster [DBG] 4.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 101)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:11.503004+0000 osd.1 (osd.1) 100 : cluster [DBG] 4.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:11.513501+0000 osd.1 (osd.1) 101 : cluster [DBG] 4.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:42.579503+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:43.579698+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:44.579842+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 509062 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:45.580007+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66469888 unmapped: 630784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:46.580113+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:16.522398+0000 osd.1 (osd.1) 102 : cluster [DBG] 2.5 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:16.532978+0000 osd.1 (osd.1) 103 : cluster [DBG] 2.5 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 103)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:16.522398+0000 osd.1 (osd.1) 102 : cluster [DBG] 2.5 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:16.532978+0000 osd.1 (osd.1) 103 : cluster [DBG] 2.5 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:47.580344+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:48.580614+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:18.457633+0000 osd.1 (osd.1) 104 : cluster [DBG] 2.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:18.468178+0000 osd.1 (osd.1) 105 : cluster [DBG] 2.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 105)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:18.457633+0000 osd.1 (osd.1) 104 : cluster [DBG] 2.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:18.468178+0000 osd.1 (osd.1) 105 : cluster [DBG] 2.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66510848 unmapped: 589824 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:49.580815+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:19.466828+0000 osd.1 (osd.1) 106 : cluster [DBG] 2.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:19.477343+0000 osd.1 (osd.1) 107 : cluster [DBG] 2.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 107)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:19.466828+0000 osd.1 (osd.1) 106 : cluster [DBG] 2.7 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:19.477343+0000 osd.1 (osd.1) 107 : cluster [DBG] 2.7 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 518706 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:50.580998+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:20.494924+0000 osd.1 (osd.1) 108 : cluster [DBG] 5.1 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:20.505515+0000 osd.1 (osd.1) 109 : cluster [DBG] 5.1 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 109)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:20.494924+0000 osd.1 (osd.1) 108 : cluster [DBG] 5.1 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:20.505515+0000 osd.1 (osd.1) 109 : cluster [DBG] 5.1 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66527232 unmapped: 573440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:51.581155+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.991565704s of 11.008629799s, submitted: 10
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:52.581279+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:22.511695+0000 osd.1 (osd.1) 110 : cluster [DBG] 2.6 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:22.522187+0000 osd.1 (osd.1) 111 : cluster [DBG] 2.6 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66535424 unmapped: 565248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 111)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:22.511695+0000 osd.1 (osd.1) 110 : cluster [DBG] 2.6 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:22.522187+0000 osd.1 (osd.1) 111 : cluster [DBG] 2.6 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.f scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.f scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:53.581460+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:23.523664+0000 osd.1 (osd.1) 112 : cluster [DBG] 5.f scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:23.534195+0000 osd.1 (osd.1) 113 : cluster [DBG] 5.f scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 113)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:23.523664+0000 osd.1 (osd.1) 112 : cluster [DBG] 5.f scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:23.534195+0000 osd.1 (osd.1) 113 : cluster [DBG] 5.f scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:54.581671+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 523528 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:55.581803+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:56.581965+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:57.582189+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:58.582354+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 115 sent 113 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:28.530907+0000 osd.1 (osd.1) 114 : cluster [DBG] 5.c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:28.541418+0000 osd.1 (osd.1) 115 : cluster [DBG] 5.c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:59.582530+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 4 last_log 117 sent 115 num 4 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:29.520775+0000 osd.1 (osd.1) 116 : cluster [DBG] 5.1d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:29.531381+0000 osd.1 (osd.1) 117 : cluster [DBG] 5.1d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 115)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:28.530907+0000 osd.1 (osd.1) 114 : cluster [DBG] 5.c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:28.541418+0000 osd.1 (osd.1) 115 : cluster [DBG] 5.c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 528352 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:00.582778+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 117)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:29.520775+0000 osd.1 (osd.1) 116 : cluster [DBG] 5.1d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:29.531381+0000 osd.1 (osd.1) 117 : cluster [DBG] 5.1d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:01.582931+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 119 sent 117 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:31.514449+0000 osd.1 (osd.1) 118 : cluster [DBG] 5.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:31.524985+0000 osd.1 (osd.1) 119 : cluster [DBG] 5.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 119)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:31.514449+0000 osd.1 (osd.1) 118 : cluster [DBG] 5.19 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:31.524985+0000 osd.1 (osd.1) 119 : cluster [DBG] 5.19 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:02.583175+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.000996590s of 11.020680428s, submitted: 10
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:03.583342+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 121 sent 119 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:33.532373+0000 osd.1 (osd.1) 120 : cluster [DBG] 2.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:33.542789+0000 osd.1 (osd.1) 121 : cluster [DBG] 2.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 121)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:33.532373+0000 osd.1 (osd.1) 120 : cluster [DBG] 2.9 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:33.542789+0000 osd.1 (osd.1) 121 : cluster [DBG] 2.9 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:04.583620+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 535589 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:05.583760+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 123 sent 121 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:35.487393+0000 osd.1 (osd.1) 122 : cluster [DBG] 5.1a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:35.498089+0000 osd.1 (osd.1) 123 : cluster [DBG] 5.1a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 123)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:35.487393+0000 osd.1 (osd.1) 122 : cluster [DBG] 5.1a scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:35.498089+0000 osd.1 (osd.1) 123 : cluster [DBG] 5.1a scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:06.583993+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:07.584133+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:08.584430+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 125 sent 123 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:38.562295+0000 osd.1 (osd.1) 124 : cluster [DBG] 5.18 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:38.572881+0000 osd.1 (osd.1) 125 : cluster [DBG] 5.18 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 125)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:38.562295+0000 osd.1 (osd.1) 124 : cluster [DBG] 5.18 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:38.572881+0000 osd.1 (osd.1) 125 : cluster [DBG] 5.18 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:09.584680+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:10.584943+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 127 sent 125 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:40.547425+0000 osd.1 (osd.1) 126 : cluster [DBG] 6.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:40.572136+0000 osd.1 (osd.1) 127 : cluster [DBG] 6.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540413 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 127)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:40.547425+0000 osd.1 (osd.1) 126 : cluster [DBG] 6.4 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:40.572136+0000 osd.1 (osd.1) 127 : cluster [DBG] 6.4 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:11.585164+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:12.585283+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 129 sent 127 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:42.465943+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:42.480111+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 129)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:42.465943+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.b scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:42.480111+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.b scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:13.585488+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 131 sent 129 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:43.467445+0000 osd.1 (osd.1) 130 : cluster [DBG] 6.1 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:43.477985+0000 osd.1 (osd.1) 131 : cluster [DBG] 6.1 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.957039833s of 10.982520103s, submitted: 12
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 131)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:43.467445+0000 osd.1 (osd.1) 130 : cluster [DBG] 6.1 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:43.477985+0000 osd.1 (osd.1) 131 : cluster [DBG] 6.1 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:14.585787+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 133 sent 131 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:44.514975+0000 osd.1 (osd.1) 132 : cluster [DBG] 6.6 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:44.529090+0000 osd.1 (osd.1) 133 : cluster [DBG] 6.6 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 133)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:44.514975+0000 osd.1 (osd.1) 132 : cluster [DBG] 6.6 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:44.529090+0000 osd.1 (osd.1) 133 : cluster [DBG] 6.6 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:15.585978+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 135 sent 133 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:45.562112+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:45.572685+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 550057 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 135)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:45.562112+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.2 scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:45.572685+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.2 scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:16.586159+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 137 sent 135 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:46.554453+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:46.572068+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 137)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:46.554453+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.d scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:46.572068+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.d scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:17.586370+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 139 sent 137 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:47.542428+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:47.556620+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 139)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:47.542428+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.c scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:47.556620+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.c scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:18.586645+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:19.586800+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:20.586962+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 554879 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:21.587096+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 141 sent 139 num 2 unsent 2 sending 2
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:51.549371+0000 osd.1 (osd.1) 140 : cluster [DBG] 6.e scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:51.563550+0000 osd.1 (osd.1) 141 : cluster [DBG] 6.e scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 141)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:51.549371+0000 osd.1 (osd.1) 140 : cluster [DBG] 6.e scrub starts
Dec 03 21:34:42 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:51.563550+0000 osd.1 (osd.1) 141 : cluster [DBG] 6.e scrub ok
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:22.587311+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:23.587510+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:24.587732+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:25.587931+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:26.588138+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:27.588293+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:28.588469+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:29.588701+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:30.588874+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:31.589026+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:32.589183+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:33.589369+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:34.589488+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:35.589636+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:36.589767+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:37.589964+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:38.590168+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:39.590319+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:40.590455+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:41.590650+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:42.590934+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:43.591114+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:44.591289+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:45.591505+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:46.591657+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:47.591783+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:48.591966+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:49.592096+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:50.592213+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:51.592353+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:52.592494+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:53.592693+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:54.592858+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:55.592986+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:56.593146+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:57.593314+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:58.593497+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:59.593645+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:00.593813+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:01.593958+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:02.594145+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:03.594302+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:04.594452+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:05.594584+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:06.594757+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:07.594898+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:08.595098+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:09.595284+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:10.595484+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:11.595676+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:12.595835+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:13.596053+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:14.596306+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:15.596519+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:16.596668+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:17.596848+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:18.597029+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:19.597197+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:20.597369+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:21.597526+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:22.597634+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:23.597784+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:24.597955+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:25.598067+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:26.598203+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:27.598394+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:28.598640+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:29.598826+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:30.599058+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:31.599451+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:32.599631+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:33.599819+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:34.599973+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:35.600175+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:36.600345+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:37.600528+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:38.600738+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:39.600901+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:40.601080+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:41.601213+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:42.601376+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:43.601532+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:44.601691+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:45.601829+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:46.601997+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:47.602113+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:48.602363+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:49.602522+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:50.602744+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:51.602947+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:52.603140+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:53.603300+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:54.603481+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:55.603640+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:56.603809+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:57.603957+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:58.604162+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:59.604340+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:00.604466+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67051520 unmapped: 49152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:01.604709+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67051520 unmapped: 49152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:02.604879+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67051520 unmapped: 49152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:03.605051+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67059712 unmapped: 40960 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:04.605207+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67059712 unmapped: 40960 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:05.605372+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 32768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:06.605503+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 32768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:07.605663+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 32768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:08.605860+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67076096 unmapped: 24576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:09.606037+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67076096 unmapped: 24576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:10.606180+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67084288 unmapped: 16384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:11.606385+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67084288 unmapped: 16384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:12.606536+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 8192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:13.606671+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 8192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:14.606869+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 8192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:15.606996+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:16.607138+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:17.607300+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:18.607492+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:19.607655+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:20.607813+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:21.607961+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:22.608095+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:23.608237+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:24.608388+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:25.608549+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:26.608687+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:27.608799+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:28.609003+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:29.609139+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:30.609303+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:31.609457+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:32.609642+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:33.609827+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:34.610007+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:35.610217+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:36.610363+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:37.610514+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:38.610736+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:39.610861+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:40.611069+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:41.611198+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:42.611319+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:43.611500+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:44.611619+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:45.611820+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:46.612010+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:47.612146+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:48.612438+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:49.612657+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:50.612846+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:51.613007+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67223552 unmapped: 925696 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:52.613136+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67223552 unmapped: 925696 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:53.613341+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:54.613468+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:55.613614+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:56.613732+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:57.613869+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:58.614054+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:59.614230+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:00.614398+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:01.614560+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:02.614745+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:03.614865+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:04.615001+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:05.615155+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:06.615277+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:07.615438+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:08.615645+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:09.615799+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:10.615932+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:11.616062+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67289088 unmapped: 860160 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:12.616191+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67289088 unmapped: 860160 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:13.616416+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:14.616645+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:15.616814+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:16.616944+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:17.617185+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:18.617402+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:19.617528+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:20.617623+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:21.617762+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:22.617894+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:23.618038+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:24.618159+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:25.618308+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:26.618465+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:27.618661+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:28.618819+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:29.618941+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:30.619171+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:31.619364+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:32.619549+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:33.619710+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:34.619925+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:35.620052+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:36.620180+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:37.620327+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:38.620545+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:39.620749+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:40.620961+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:41.621202+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:42.621436+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:43.621632+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:44.621800+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:45.621989+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:46.622172+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:47.622353+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:48.622651+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:49.622878+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:50.622996+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:51.623144+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:52.623403+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:53.623590+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:54.623850+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:55.624049+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:56.624339+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:57.624525+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:58.624828+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:59.625080+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:00.625320+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:01.625493+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:02.625787+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:03.625972+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:04.626168+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:05.626397+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:06.626613+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:07.626865+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:08.627062+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:09.627208+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:10.627383+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:11.627583+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:12.627932+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:13.628255+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:14.628509+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:15.628732+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:16.628994+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:17.629213+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:18.629533+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:19.629899+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:20.630155+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67510272 unmapped: 638976 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:21.630729+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67510272 unmapped: 638976 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:22.630897+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67510272 unmapped: 638976 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:23.631190+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67518464 unmapped: 630784 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:24.631417+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67518464 unmapped: 630784 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:25.631645+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67518464 unmapped: 630784 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:26.631847+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 622592 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:27.632062+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 622592 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:28.632333+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67534848 unmapped: 614400 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:29.632489+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67534848 unmapped: 614400 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:30.632702+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:31.632835+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:32.632973+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:33.633155+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:34.633290+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67559424 unmapped: 589824 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:35.633457+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67559424 unmapped: 589824 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:36.633648+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67567616 unmapped: 581632 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:37.633849+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67567616 unmapped: 581632 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:38.634085+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67575808 unmapped: 573440 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:39.634234+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67575808 unmapped: 573440 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:40.634386+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67584000 unmapped: 565248 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:41.634608+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67584000 unmapped: 565248 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:42.634776+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67584000 unmapped: 565248 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:43.634997+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67592192 unmapped: 557056 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:44.635201+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67592192 unmapped: 557056 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:45.635359+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67600384 unmapped: 548864 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:46.635643+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67600384 unmapped: 548864 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:47.635844+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67600384 unmapped: 548864 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:48.636025+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:49.636158+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:50.636333+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:51.636710+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:52.636858+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:53.636987+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:54.637117+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:55.637487+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:56.637791+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67624960 unmapped: 524288 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:57.638071+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67624960 unmapped: 524288 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:58.638450+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67633152 unmapped: 516096 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:59.638685+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67633152 unmapped: 516096 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:00.638869+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 507904 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:01.638990+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 507904 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:02.639150+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 507904 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:03.639328+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67649536 unmapped: 499712 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:04.639467+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67649536 unmapped: 499712 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:05.639681+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67649536 unmapped: 499712 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:06.639891+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67657728 unmapped: 491520 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:07.640152+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67657728 unmapped: 491520 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:08.640450+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67665920 unmapped: 483328 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:09.640664+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67665920 unmapped: 483328 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:10.640942+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 475136 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:11.641179+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 475136 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:12.641372+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 475136 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:13.641520+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67682304 unmapped: 466944 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:14.641812+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67682304 unmapped: 466944 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:15.642059+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 458752 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:16.643149+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 458752 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:17.643295+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 458752 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:18.643478+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67698688 unmapped: 450560 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:19.643642+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67698688 unmapped: 450560 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:20.643801+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67698688 unmapped: 450560 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:21.644014+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67706880 unmapped: 442368 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:22.644222+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67706880 unmapped: 442368 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:23.644454+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67715072 unmapped: 434176 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:24.644708+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67715072 unmapped: 434176 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:25.644888+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67715072 unmapped: 434176 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:26.645084+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67723264 unmapped: 425984 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:27.645201+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67723264 unmapped: 425984 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:28.645415+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67723264 unmapped: 425984 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:29.645607+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67731456 unmapped: 417792 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:30.645760+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67731456 unmapped: 417792 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:31.645928+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67739648 unmapped: 409600 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:32.646138+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67739648 unmapped: 409600 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:33.646270+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67739648 unmapped: 409600 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:34.646492+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67747840 unmapped: 401408 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:35.646737+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67747840 unmapped: 401408 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:36.646981+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 393216 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:37.647283+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 393216 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:38.647647+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 393216 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:39.647940+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67764224 unmapped: 385024 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:40.648189+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67764224 unmapped: 385024 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:41.648399+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67772416 unmapped: 376832 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:42.648676+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67772416 unmapped: 376832 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:43.648912+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67780608 unmapped: 368640 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 16.57 MB, 0.03 MB/s
                                           Interval WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:44.649216+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:45.649450+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67870720 unmapped: 278528 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:46.649793+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67878912 unmapped: 270336 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:47.650054+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67878912 unmapped: 270336 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:48.650259+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67878912 unmapped: 270336 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:49.650430+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67887104 unmapped: 262144 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:50.650727+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67887104 unmapped: 262144 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:51.650884+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 253952 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:52.651059+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 253952 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:53.651271+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 253952 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:54.651468+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67903488 unmapped: 245760 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:55.651662+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67903488 unmapped: 245760 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:56.651942+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 237568 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:57.652185+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 237568 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:58.652376+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 237568 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:59.652652+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 229376 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:00.652914+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 229376 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:01.653097+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67928064 unmapped: 221184 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:02.653278+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67928064 unmapped: 221184 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:03.653490+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67928064 unmapped: 221184 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:04.653739+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 212992 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:05.653926+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 212992 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:06.654177+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 204800 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:07.654471+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 204800 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:08.655194+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 204800 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:09.655463+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 196608 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:10.655769+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 196608 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:11.656059+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 196608 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:12.656294+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 188416 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:13.656607+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 188416 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:14.656846+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 180224 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:15.657081+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 180224 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:16.657346+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 180224 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:17.657537+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 172032 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:18.657783+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 172032 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:19.657990+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 163840 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:20.658196+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 163840 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:21.658347+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 163840 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:22.658487+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:23.658663+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:24.658865+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 147456 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:25.659060+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 147456 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:26.659264+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 139264 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:27.659409+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 139264 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:28.659660+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 131072 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:29.659836+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 131072 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:30.660001+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 131072 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:31.660196+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 122880 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:32.660332+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 122880 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:33.660487+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 122880 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:34.660650+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 114688 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:35.660841+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 114688 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:36.660964+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 114688 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:37.661104+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 106496 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:38.661315+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 106496 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:39.661476+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 106496 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:40.661611+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 98304 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:41.661796+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 98304 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:42.661957+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 90112 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:43.662089+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 90112 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:44.662238+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 81920 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:45.662394+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 81920 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:46.662610+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 81920 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:47.662745+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 73728 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:48.662927+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 73728 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:49.663155+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 65536 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:50.663336+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 65536 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:51.663617+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 65536 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:52.663792+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68091904 unmapped: 57344 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:53.664030+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68091904 unmapped: 57344 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:54.664239+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 49152 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:55.664385+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 49152 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:56.664643+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 49152 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:57.664803+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 40960 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:58.665014+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 40960 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:59.665170+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 32768 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:00.665373+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 32768 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:01.665514+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 32768 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:02.665681+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 24576 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:03.665869+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 24576 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:04.666023+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 24576 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:05.666184+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 16384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:06.666326+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 16384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:07.666477+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 16384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:08.666665+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 8192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:09.666806+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 8192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:10.666957+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 0 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:11.667107+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 0 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:12.667274+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 1040384 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:13.667402+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 1040384 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:14.667603+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 1040384 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:15.667754+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68165632 unmapped: 1032192 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:16.667910+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68165632 unmapped: 1032192 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:17.668070+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 1024000 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:18.668260+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 1024000 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:19.668419+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 1024000 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:20.668602+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 1015808 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:21.669400+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 1015808 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:22.669654+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 1015808 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:23.669892+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 1007616 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:24.669985+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 1007616 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:25.670260+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 999424 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:26.670420+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 999424 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:27.670603+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 999424 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:28.670792+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 991232 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:29.670971+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 991232 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:30.671112+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 983040 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:31.671283+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 983040 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:32.671404+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 974848 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:33.671625+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 974848 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:34.671781+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 958464 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:35.672101+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 958464 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:36.672280+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 958464 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:37.672416+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 950272 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:38.672613+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 950272 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:39.672776+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 950272 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:40.672917+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 942080 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:41.673116+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 942080 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:42.673309+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 942080 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:43.673457+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:44.673603+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:45.673800+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:46.674015+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:47.674300+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:48.674431+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:49.674553+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:50.674681+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:51.674808+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:52.675056+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:53.675212+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:54.675369+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:55.675539+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:56.675695+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:57.676949+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:58.678221+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:59.678373+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:00.678706+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:01.678825+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:02.678979+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:03.679104+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:04.679254+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:05.679410+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:06.679662+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:07.679842+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:08.680007+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:09.680153+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:10.680358+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:11.681045+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:12.681182+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:13.681348+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:14.681488+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:15.681700+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:16.682238+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:17.682647+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:18.682908+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:19.683182+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:20.683357+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:21.683735+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:22.684077+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:23.684219+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:24.684348+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:25.684475+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:26.684633+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:27.684790+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:28.684955+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:29.685041+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:30.685212+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:31.685382+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:32.685562+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:33.685749+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:34.685901+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:35.686107+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:36.686261+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:37.686465+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:38.686644+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:39.686757+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:40.686879+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:41.686992+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:42.687154+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:43.687330+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:44.687486+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:45.687648+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:46.687796+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:47.687928+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:48.688178+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:49.688313+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:50.688424+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:51.688564+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:52.688753+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:53.688943+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:54.689133+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:55.689289+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:56.689447+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:57.689601+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:58.689722+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:59.689846+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:00.690000+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:01.690165+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:02.690294+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:03.690459+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:04.690678+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:05.690807+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:06.690958+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:07.691201+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:08.691351+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:09.691553+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:10.691757+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:11.691954+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:12.692128+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:13.692257+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:14.692395+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:15.692597+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:16.692812+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:17.693035+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:18.693257+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:19.693493+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:20.693695+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:21.693889+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:22.694086+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:23.694275+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:24.694507+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:25.694655+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:26.694882+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:27.695081+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:28.695334+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:29.695631+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:30.695805+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:31.695960+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:32.696112+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:33.696716+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:34.696909+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:35.697058+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:36.697229+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:37.697441+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:38.698202+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:39.698404+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:40.698613+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:41.698909+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:42.699174+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:43.699506+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:44.699838+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:45.700096+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:46.700357+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:47.700592+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:48.700806+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:49.701081+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:50.701323+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:51.701732+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:52.701975+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:53.702136+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:54.702305+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:55.702510+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:56.702669+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:57.702901+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:58.703149+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:59.703308+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:00.703518+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:01.703700+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:02.703872+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:03.704115+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:04.704330+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:05.704540+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:06.704855+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:07.705010+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:08.705338+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:09.705488+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:10.705677+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:11.705850+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:12.706118+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:13.706333+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:14.706502+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:15.706659+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:16.706879+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:17.707117+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:18.707458+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:19.707652+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:20.707814+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:21.707997+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:22.708215+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:23.708368+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:24.708523+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:25.708691+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:26.708810+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:27.708959+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:28.709158+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:29.709307+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:30.709516+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:31.709703+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:32.709854+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:33.710076+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:34.710284+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:35.710499+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:36.710739+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:37.710971+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:38.711207+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:39.711374+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:40.711512+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:41.711676+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:42.711854+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:43.712001+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:44.712144+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:45.712282+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:46.712486+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:47.712664+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:48.712874+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:49.713025+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: mgrc ms_handle_reset ms_handle_reset con 0x55cf1d7fe000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1553116858
Dec 03 21:34:42 compute-0 ceph-osd[87094]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1553116858,v1:192.168.122.100:6801/1553116858]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: get_auth_request con 0x55cf1e65f000 auth_method 0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: mgrc handle_mgr_configure stats_period=5
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68313088 unmapped: 884736 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:50.713212+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 ms_handle_reset con 0x55cf1ee92400 session 0x55cf1e2d7880
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 ms_handle_reset con 0x55cf1ee92800 session 0x55cf1e2416c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d878800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:51.713392+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:52.713630+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:53.713761+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:54.713945+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:55.714127+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:56.714279+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:57.714525+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:58.714773+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:59.714891+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:00.715057+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:01.715185+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:02.715351+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:03.715498+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:04.715623+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:05.715749+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:06.715974+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:07.716166+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:08.716625+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:09.716881+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:10.717239+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:11.717395+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:12.717550+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:13.717719+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:14.717879+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:15.718054+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:16.718168+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:17.718348+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:18.718605+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:19.718807+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:20.718976+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:21.719180+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:22.719474+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:23.719682+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:24.719841+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:25.719989+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:26.720166+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:27.720313+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:28.720522+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:29.720745+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:30.720906+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:31.721102+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:32.721292+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:33.721434+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:34.721657+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:35.721789+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:36.722055+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:37.722203+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:38.722401+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:39.722551+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:40.722836+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:41.722994+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:42.723132+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:43.723259+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:44.723642+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:45.723808+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:46.724000+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:47.724147+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:48.724338+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:49.724512+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:50.724678+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:51.725052+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:52.725254+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:53.725422+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:54.725595+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:55.725735+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:56.725834+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:57.726000+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:58.726185+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:59.726360+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:00.726494+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:01.726674+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:02.726812+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:03.726968+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:04.727101+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:05.727240+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:06.727394+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:07.727525+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:08.727705+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:09.727814+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:10.727910+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:11.728055+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:12.728187+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:13.728311+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:14.728451+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:15.728645+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:16.728789+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:17.728906+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:18.729046+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:19.729168+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:20.729400+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:21.729552+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:22.729713+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:23.729854+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:24.730061+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:25.760909+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:26.761028+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:27.761194+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:28.761404+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:29.761564+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:30.761738+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:31.761874+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:32.762080+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:33.762308+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:34.762537+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:35.762682+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:36.762911+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:37.763086+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:38.763276+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:39.763430+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:40.763649+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:41.763787+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:42.763926+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:43.764073+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:44.764239+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:45.764417+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:46.764643+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:47.764800+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:48.765003+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:49.765176+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:50.765398+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:51.765565+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:52.765774+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:53.765930+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:54.766703+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:55.766916+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:56.767155+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:57.767303+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:58.767520+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:59.767686+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:00.767876+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:01.768058+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:02.768289+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:03.768439+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:04.768635+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:05.768811+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:06.768988+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:07.769150+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:08.769441+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:09.769607+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:10.769771+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:11.769969+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:12.770135+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:13.770298+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:14.770472+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:15.770630+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:16.770761+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:17.770891+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:18.771103+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:19.771280+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:20.771502+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:21.771621+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:22.771729+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:23.771824+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:24.771917+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:25.772108+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:26.772272+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:27.772482+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:28.772657+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:29.772827+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:30.773023+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:31.773201+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:32.773357+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:33.773534+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:34.773711+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:35.773883+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:36.774043+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:37.774219+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:38.774380+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:39.774512+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:40.774725+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:41.774893+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:42.775086+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:43.775244+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:44.775391+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:45.775536+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:46.775811+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:47.775952+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:48.776196+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:49.776402+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:50.776620+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:51.776797+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:52.776940+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:53.777141+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:54.777394+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:55.777618+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:56.777869+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:57.778075+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:58.778265+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:59.778534+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:00.778726+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:01.778932+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:02.779078+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:03.779256+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:04.779480+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:05.779751+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:06.779990+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:07.780243+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:08.780482+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:09.780643+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:10.780854+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:11.781035+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:12.781460+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:13.781637+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:14.781813+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:15.782004+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:16.782189+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:17.782383+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:18.782554+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:19.782764+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:20.782929+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:21.783149+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:22.783293+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:23.783421+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:24.783607+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:25.783761+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:26.784042+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:27.784248+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:28.784463+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:29.784621+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:30.784838+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:31.785065+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:32.785275+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:33.785415+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:34.785696+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:35.785965+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:36.786178+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:37.786328+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:38.786495+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:39.786665+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:40.786886+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:41.787152+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:42.787467+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:43.787636+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:44.787818+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:45.788010+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:46.788168+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:47.788315+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:48.788478+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:49.788639+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:50.788800+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:51.788978+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:52.789143+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:53.789307+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:54.789486+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:55.789646+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:56.789807+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:57.789970+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:58.790170+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:59.790351+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:00.790517+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:01.790690+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:02.790884+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:03.791036+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:04.791178+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:05.791374+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:06.791535+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:07.792185+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:08.792376+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:09.792667+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:10.792832+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:11.793023+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:12.793403+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:13.793650+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:14.793827+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:15.793979+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:16.794185+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:17.794466+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:18.794706+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:19.794883+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:20.795098+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:21.795251+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:22.795453+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:23.795643+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:24.795905+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:25.796076+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:26.796269+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:27.796489+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:28.796668+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:29.796796+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:30.796959+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:31.797129+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:32.797272+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:33.797480+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:34.797679+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:35.797841+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:36.797992+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:37.798187+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:38.798389+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:39.798539+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:40.798790+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:41.799345+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:42.799620+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:43.799814+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:44.800000+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:45.800252+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:46.800458+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:47.800680+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:48.800943+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:49.801148+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:50.801278+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:51.801543+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:52.801776+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:53.801925+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:54.802107+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:55.803117+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:56.803882+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:57.804191+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:58.804507+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:59.805328+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:00.805950+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:01.806156+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:02.806507+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:03.806673+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:04.806973+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:05.807533+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:06.807802+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:07.808117+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:08.808371+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:09.808641+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:10.808886+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:11.809097+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:12.809286+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:13.809499+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:14.809694+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:15.809914+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:16.810082+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:17.810263+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:18.810474+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:19.810671+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:20.810867+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:21.811069+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:22.811259+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:23.811492+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:24.811654+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:25.811806+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:26.811920+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:27.812850+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:28.813253+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:29.815675+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:30.816894+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:31.817229+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:32.817915+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:33.818370+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:34.818734+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:35.819347+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:36.819672+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:37.819811+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:38.820114+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:39.820358+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:40.820532+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:41.820719+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:42.821032+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:43.821384+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:44.821666+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:45.821973+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:46.822131+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:47.822343+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:48.822626+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:49.822831+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:50.823025+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:51.823210+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:52.823351+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:53.823507+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:54.823661+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:55.823824+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:56.824004+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:57.824197+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:58.824418+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:59.824683+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:00.824928+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:01.825157+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:02.825345+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:03.825544+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:04.825877+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:05.826142+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:06.826284+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:07.826517+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:08.826747+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:09.827047+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:10.827280+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:11.827445+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:12.827805+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:13.828212+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:14.828389+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:15.828737+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:16.829181+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:17.829401+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:18.829635+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 1025.044555664s of 1025.062866211s, submitted: 10
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:19.829854+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 562450 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68706304 unmapped: 491520 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 63 heartbeat osd_stat(store_statfs(0x4fe0a7000/0x0/0x4ffc00000, data 0xb962c/0x123000, compress 0x0/0x0/0x0, omap 0xafb1, meta 0x1a2504f), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:20.830025+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 450560 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 65 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb84700
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:21.830208+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 69083136 unmapped: 16900096 heap: 85983232 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:22.830398+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 16883712 heap: 85983232 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 65 heartbeat osd_stat(store_statfs(0x4fd89b000/0x0/0x4ffc00000, data 0x8bc287/0x92b000, compress 0x0/0x0/0x0, omap 0xb4cd, meta 0x1a24b33), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:23.830604+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 16883712 heap: 85983232 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 65 heartbeat osd_stat(store_statfs(0x4fd89b000/0x0/0x4ffc00000, data 0x8bc287/0x92b000, compress 0x0/0x0/0x0, omap 0xb4cd, meta 0x1a24b33), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:24.830781+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 682603 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 66 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1fbcb6c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:25.830992+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:26.831190+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:27.831342+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:28.831648+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 66 heartbeat osd_stat(store_statfs(0x4fcc2a000/0x0/0x4ffc00000, data 0x152d8a3/0x15a0000, compress 0x0/0x0/0x0, omap 0xbaaf, meta 0x1a24551), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:29.831862+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 685815 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 66 heartbeat osd_stat(store_statfs(0x4fcc2a000/0x0/0x4ffc00000, data 0x152d8a3/0x15a0000, compress 0x0/0x0/0x0, omap 0xbaaf, meta 0x1a24551), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 66 heartbeat osd_stat(store_statfs(0x4fcc2a000/0x0/0x4ffc00000, data 0x152d8a3/0x15a0000, compress 0x0/0x0/0x0, omap 0xbaaf, meta 0x1a24551), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:30.832057+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 66 handle_osd_map epochs [66,67], i have 66, src has [1,67]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.130791664s of 11.549943924s, submitted: 54
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:31.832218+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:32.832411+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:33.832632+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:34.832813+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:35.832950+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:36.833147+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:37.833357+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:38.833628+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:39.833775+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:40.833921+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:41.834080+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:42.834168+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:43.834342+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:44.834486+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:45.834658+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:46.834781+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:47.835031+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:48.835229+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:49.835346+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:50.835710+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:51.835938+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:52.836126+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:53.836353+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:54.836636+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:55.836877+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:56.837144+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:57.837335+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:58.837554+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:59.837811+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:00.837978+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:01.838298+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.202196121s of 31.212696075s, submitted: 13
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:02.838454+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 67 handle_osd_map epochs [67,68], i have 67, src has [1,68]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 68 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1f791a40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 23699456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:03.838664+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 23699456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 68 heartbeat osd_stat(store_statfs(0x4fcc23000/0x0/0x4ffc00000, data 0x1530330/0x15a7000, compress 0x0/0x0/0x0, omap 0xc2b9, meta 0x1a23d47), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:04.838813+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 68 heartbeat osd_stat(store_statfs(0x4fcc23000/0x0/0x4ffc00000, data 0x1530330/0x15a7000, compress 0x0/0x0/0x0, omap 0xc2b9, meta 0x1a23d47), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 694334 data_alloc: 218103808 data_used: 677
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 23699456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:05.838934+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 22200320 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 68 heartbeat osd_stat(store_statfs(0x4fc424000/0x0/0x4ffc00000, data 0x1d30353/0x1da8000, compress 0x0/0x0/0x0, omap 0xc2b9, meta 0x1a23d47), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:06.839663+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 21954560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:07.839830+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 69 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1fbe4e00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 21938176 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1e773c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:08.839967+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 69 heartbeat osd_stat(store_statfs(0x4f9c1f000/0x0/0x4ffc00000, data 0x4531923/0x45ab000, compress 0x0/0x0/0x0, omap 0xc53b, meta 0x1a23ac5), peers [0,2] op hist [1])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 70 ms_handle_reset con 0x55cf1e773c00 session 0x55cf1e2b2000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 20766720 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:09.840081+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e2408c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 713122 data_alloc: 218103808 data_used: 677
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1e704e00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 20643840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:10.840283+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1e704fc0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadb400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1fadb400 session 0x55cf1da9a8c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 20463616 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:11.840451+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 20332544 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:12.840656+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.184928894s of 10.620976448s, submitted: 112
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 72 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1f7916c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 20480000 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 72 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1dc25c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 72 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1dc24700
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:13.840823+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 72 heartbeat osd_stat(store_statfs(0x4fcc15000/0x0/0x4ffc00000, data 0x1535f4b/0x15b5000, compress 0x0/0x0/0x0, omap 0xd0d5, meta 0x1a22f2b), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadb400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 20480000 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdaa800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:14.841021+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 72 handle_osd_map epochs [73,73], i have 72, src has [1,73]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 73 ms_handle_reset con 0x55cf1fdaa800 session 0x55cf1e3c1880
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 73 ms_handle_reset con 0x55cf1fadb400 session 0x55cf1d5af180
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 723385 data_alloc: 218103808 data_used: 677
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 20226048 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:15.841178+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fcc13000/0x0/0x4ffc00000, data 0x1537959/0x15b7000, compress 0x0/0x0/0x0, omap 0xd866, meta 0x1a2279a), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 74 handle_osd_map epochs [74,75], i have 74, src has [1,75]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 75 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1fb44fc0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 75 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1ee96a80
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 75 heartbeat osd_stat(store_statfs(0x4fcc13000/0x0/0x4ffc00000, data 0x1537959/0x15b7000, compress 0x0/0x0/0x0, omap 0xd866, meta 0x1a2279a), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 19742720 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:16.841378+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 19693568 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:17.841644+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 19693568 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:18.841915+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 19693568 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:19.842163+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 75 heartbeat osd_stat(store_statfs(0x4fcc11000/0x0/0x4ffc00000, data 0x1539c56/0x15bb000, compress 0x0/0x0/0x0, omap 0xdf34, meta 0x1a220cc), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 725390 data_alloc: 218103808 data_used: 677
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74752000 unmapped: 19628032 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:20.842309+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 76 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1e2d6700
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74760192 unmapped: 19619840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 76 heartbeat osd_stat(store_statfs(0x4fba70000/0x0/0x4ffc00000, data 0x1539c79/0x15bc000, compress 0x0/0x0/0x0, omap 0xdf34, meta 0x2bc20cc), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:21.842461+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 76 heartbeat osd_stat(store_statfs(0x4fba6b000/0x0/0x4ffc00000, data 0x153b262/0x15bf000, compress 0x0/0x0/0x0, omap 0xe1c1, meta 0x2bc1e3f), peers [0,2] op hist [0,0,0,0,0,0,1])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74776576 unmapped: 19603456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:22.842639+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 76 handle_osd_map epochs [76,77], i have 76, src has [1,77]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.756548882s of 10.034585953s, submitted: 104
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 77 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb85500
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74776576 unmapped: 19603456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd4c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:23.842715+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 78 ms_handle_reset con 0x55cf1fdd4c00 session 0x55cf1fb44540
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74809344 unmapped: 19570688 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd4800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:24.842889+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 79 ms_handle_reset con 0x55cf1fdd4800 session 0x55cf1e2b3a40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 747169 data_alloc: 218103808 data_used: 677
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74817536 unmapped: 19562496 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:25.843012+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 80 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1d865880
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74817536 unmapped: 19562496 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 81 heartbeat osd_stat(store_statfs(0x4fba60000/0x0/0x4ffc00000, data 0x153fa30/0x15cc000, compress 0x0/0x0/0x0, omap 0xeb96, meta 0x2bc146a), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:26.843186+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 82 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1ee97a40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 82 heartbeat osd_stat(store_statfs(0x4fba54000/0x0/0x4ffc00000, data 0x15424fc/0x15d4000, compress 0x0/0x0/0x0, omap 0xf4d9, meta 0x2bc0b27), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 18595840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:27.843372+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 18595840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:28.843562+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 82 heartbeat osd_stat(store_statfs(0x4fba4f000/0x0/0x4ffc00000, data 0x1543ac9/0x15d7000, compress 0x0/0x0/0x0, omap 0xf87b, meta 0x2bc0785), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 18595840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:29.843745+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 761402 data_alloc: 218103808 data_used: 677
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 18685952 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:30.843912+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 83 heartbeat osd_stat(store_statfs(0x4fba55000/0x0/0x4ffc00000, data 0x1543ac9/0x15d7000, compress 0x0/0x0/0x0, omap 0xfaa1, meta 0x2bc055f), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 83 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1e75f6c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 18546688 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:31.844069+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 84 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1e2b3a40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75997184 unmapped: 18382848 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1e773000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:32.844225+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.886656761s of 10.113715172s, submitted: 149
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 85 ms_handle_reset con 0x55cf1e773000 session 0x55cf1fbcac40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76029952 unmapped: 18350080 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 85 heartbeat osd_stat(store_statfs(0x4fba4f000/0x0/0x4ffc00000, data 0x15466d8/0x15db000, compress 0x0/0x0/0x0, omap 0x105a5, meta 0x2bbfa5b), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:33.844406+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 86 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1dc24c40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76111872 unmapped: 18268160 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:34.844555+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 770245 data_alloc: 218103808 data_used: 12860
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 87 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1f791a40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 18145280 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:35.845187+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 88 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1ee96e00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fba49000/0x0/0x4ffc00000, data 0x154a328/0x15e0000, compress 0x0/0x0/0x0, omap 0x10d34, meta 0x2bbf2cc), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 18096128 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:36.845327+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 18096128 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:37.845548+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fba44000/0x0/0x4ffc00000, data 0x154b926/0x15e2000, compress 0x0/0x0/0x0, omap 0x10fef, meta 0x2bbf011), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 18079744 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:38.845886+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 18079744 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:39.846105+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 772529 data_alloc: 218103808 data_used: 12860
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 18079744 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:40.846317+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:41.846490+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 89 heartbeat osd_stat(store_statfs(0x4fba45000/0x0/0x4ffc00000, data 0x154cdf2/0x15e5000, compress 0x0/0x0/0x0, omap 0x1120b, meta 0x2bbedf5), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:42.846663+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:43.846856+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:44.847049+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 774677 data_alloc: 218103808 data_used: 20982
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 18194432 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:45.847231+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.887969017s of 13.055091858s, submitted: 117
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 90 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1ee97340
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:46.847425+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:47.847631+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x154e41a/0x15e9000, compress 0x0/0x0/0x0, omap 0x11621, meta 0x2bbe9df), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:48.847808+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:49.847989+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 779189 data_alloc: 218103808 data_used: 20982
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:50.848147+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd4c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 90 ms_handle_reset con 0x55cf1fdd4c00 session 0x55cf1fb85dc0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76324864 unmapped: 18055168 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:51.848258+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 92 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1fb841c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 92 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e2b2000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 92 ms_handle_reset con 0x55cf1fdd5000 session 0x55cf1fbb0fc0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:52.848418+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 92 heartbeat osd_stat(store_statfs(0x4fba37000/0x0/0x4ffc00000, data 0x155105d/0x15f1000, compress 0x0/0x0/0x0, omap 0x11ad7, meta 0x2bbe529), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:53.848644+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:54.848846+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 790116 data_alloc: 218103808 data_used: 21017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:55.849024+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:56.849199+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:57.849325+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 92 heartbeat osd_stat(store_statfs(0x4fba37000/0x0/0x4ffc00000, data 0x155105d/0x15f1000, compress 0x0/0x0/0x0, omap 0x11ad7, meta 0x2bbe529), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:58.849517+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:59.849677+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 790116 data_alloc: 218103808 data_used: 21017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:00.849862+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.561586380s of 15.613102913s, submitted: 36
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:01.850069+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 17956864 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:02.850256+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 93 heartbeat osd_stat(store_statfs(0x4fba36000/0x0/0x4ffc00000, data 0x155250d/0x15f4000, compress 0x0/0x0/0x0, omap 0x11d8c, meta 0x2bbe274), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 17956864 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:03.850453+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 93 handle_osd_map epochs [95,95], i have 93, src has [1,95]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 93 handle_osd_map epochs [94,95], i have 93, src has [1,95]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 95 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1e2d61c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76480512 unmapped: 17899520 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:04.850623+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf2001c000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 800233 data_alloc: 218103808 data_used: 21017
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76496896 unmapped: 17883136 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:05.850801+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 95 heartbeat osd_stat(store_statfs(0x4fba2e000/0x0/0x4ffc00000, data 0x15550fb/0x15fa000, compress 0x0/0x0/0x0, omap 0x121e2, meta 0x2bbde1e), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76496896 unmapped: 17883136 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:06.850978+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76496896 unmapped: 17883136 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:07.851130+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf2001a800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 96 ms_handle_reset con 0x55cf2001a800 session 0x55cf1ee96700
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 16654336 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:08.851823+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 97 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e2408c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 16572416 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:09.852346+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 97 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1ee97dc0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 810508 data_alloc: 218103808 data_used: 21083
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 16572416 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 97 ms_handle_reset con 0x55cf1fdd5000 session 0x55cf1fbca1c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:10.852772+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 98 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1fb45880
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 98 heartbeat osd_stat(store_statfs(0x4fba26000/0x0/0x4ffc00000, data 0x1557cfb/0x1602000, compress 0x0/0x0/0x0, omap 0x12956, meta 0x2bbd6aa), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 16654336 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:11.853116+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf2001ac00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.186281204s of 10.288720131s, submitted: 73
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 99 ms_handle_reset con 0x55cf2001ac00 session 0x55cf1d5ae1c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 16564224 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:12.853386+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb85880
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 16547840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:13.853643+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 16547840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:14.853774+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 heartbeat osd_stat(store_statfs(0x4fba1c000/0x0/0x4ffc00000, data 0x155bf8b/0x160a000, compress 0x0/0x0/0x0, omap 0x13542, meta 0x2bbcabe), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 818980 data_alloc: 218103808 data_used: 21083
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 16547840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:15.854919+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77856768 unmapped: 16523264 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:16.855101+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1fb856c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1e2d7500
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fdd5000 session 0x55cf1fb84380
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1f7a6800
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1f7a6800 session 0x55cf1fb85180
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb84000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1fb84540
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1f7fd400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1f7fd400 session 0x55cf1ee976c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 16506880 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:17.855231+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1e75e540
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 16506880 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:18.855437+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1e273000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1e273000 session 0x55cf1fb5a000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1fb45180
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1fb9a1c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:19.855648+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1f7fd400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823271 data_alloc: 218103808 data_used: 21083
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 heartbeat osd_stat(store_statfs(0x4fb9fc000/0x0/0x4ffc00000, data 0x157ffdd/0x1630000, compress 0x0/0x0/0x0, omap 0x13542, meta 0x2bbcabe), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:20.855793+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:21.856075+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:22.856444+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 101 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1dc24380
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.026945114s of 11.134009361s, submitted: 77
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1faa4000 session 0x55cf1f848380
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdaa000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1e3c0c40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1fdaa000 session 0x55cf1dc256c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78766080 unmapped: 15613952 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1d5aea80
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:23.856624+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 102 handle_osd_map epochs [102,103], i have 102, src has [1,103]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 103 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1fbca700
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 15605760 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:24.856791+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 103 heartbeat osd_stat(store_statfs(0x4fb9ed000/0x0/0x4ffc00000, data 0x15840c3/0x163a000, compress 0x0/0x0/0x0, omap 0x13c4f, meta 0x2bbc3b1), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835465 data_alloc: 218103808 data_used: 23197
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1ee97c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 15605760 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:25.856940+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1faa4000 session 0x55cf1e75fdc0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1faa4000 session 0x55cf1f790c40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1da9b880
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 15605760 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:26.857212+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 heartbeat osd_stat(store_statfs(0x4fb9e8000/0x0/0x4ffc00000, data 0x15856ac/0x163d000, compress 0x0/0x0/0x0, omap 0x13f40, meta 0x2bbc0c0), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1e2d7340
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78790656 unmapped: 15589376 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:27.857356+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 104 handle_osd_map epochs [104,105], i have 105, src has [1,105]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 15556608 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 105 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1da9b180
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:28.857549+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 15556608 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:29.857777+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 105 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e705500
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 105 ms_handle_reset con 0x55cf1f7fd400 session 0x55cf1d5af6c0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839964 data_alloc: 218103808 data_used: 24430
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 15671296 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 106 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1fb44540
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:30.857966+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78716928 unmapped: 15663104 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:31.858126+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 106 heartbeat osd_stat(store_statfs(0x4fba0e000/0x0/0x4ffc00000, data 0x15642c4/0x161c000, compress 0x0/0x0/0x0, omap 0x146e3, meta 0x2bbb91d), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78716928 unmapped: 15663104 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:32.858300+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 107 heartbeat osd_stat(store_statfs(0x4fba0b000/0x0/0x4ffc00000, data 0x1565790/0x161f000, compress 0x0/0x0/0x0, omap 0x149b4, meta 0x2bbb64c), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78716928 unmapped: 15663104 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:33.858468+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.730767250s of 10.873538971s, submitted: 112
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1e2b2540
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf2001c000 session 0x55cf1ee96c40
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1fbb1500
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:34.858685+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842323 data_alloc: 218103808 data_used: 22894
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1d5afdc0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdaa400
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:35.858837+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 108 ms_handle_reset con 0x55cf1fdaa400 session 0x55cf1d5ae700
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:36.858999+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:37.859166+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 108 heartbeat osd_stat(store_statfs(0x4fba0a000/0x0/0x4ffc00000, data 0x1566d94/0x1620000, compress 0x0/0x0/0x0, omap 0x1509e, meta 0x2bbaf62), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:38.859357+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:39.859531+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843615 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:40.859742+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 108 heartbeat osd_stat(store_statfs(0x4fba0a000/0x0/0x4ffc00000, data 0x1566d94/0x1620000, compress 0x0/0x0/0x0, omap 0x1509e, meta 0x2bbaf62), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:41.859940+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:42.860171+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:43.860369+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:44.860560+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846389 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:45.860770+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 109 heartbeat osd_stat(store_statfs(0x4fba07000/0x0/0x4ffc00000, data 0x1568260/0x1623000, compress 0x0/0x0/0x0, omap 0x152b5, meta 0x2bbad4b), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 109 handle_osd_map epochs [109,110], i have 110, src has [1,110]
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.690909386s of 12.290042877s, submitted: 67
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:46.860992+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:47.861173+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:48.861407+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:49.861692+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:50.861859+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:51.862010+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:52.862195+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:53.862327+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:54.862495+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:55.862622+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:56.862816+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:57.862922+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:58.863129+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:59.863290+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:00.863475+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:01.863664+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:02.863863+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:03.864069+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:04.864265+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:05.864440+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:06.864666+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:07.864831+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:08.865030+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:09.865255+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:10.865424+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:11.865587+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:12.865741+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:13.866776+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:14.866918+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:15.867105+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:16.867297+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:17.867461+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:18.867752+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:19.867991+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:20.868167+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:21.868333+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:22.868531+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:23.868698+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:24.868848+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:25.869043+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:26.869377+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:27.869550+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:28.869754+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:29.869938+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:30.870114+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:31.870284+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:32.870501+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:33.870647+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:34.870802+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:35.870969+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:36.871101+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:37.871241+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:38.871418+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:39.871621+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:40.871830+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:41.872065+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:42.872309+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:43.872494+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:44.872666+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:45.872849+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:46.872998+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:47.873151+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:48.873337+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:49.873468+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread fragmentation_score=0.000123 took=0.000012s
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:50.873629+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:51.873750+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:52.873913+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:53.874076+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:54.874255+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:55.874416+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:56.874614+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:57.874759+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:42 compute-0 ceph-mon[75204]: from='client.14780 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:42 compute-0 ceph-mon[75204]: from='client.14784 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:42 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1602125538' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 03 21:34:42 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2487170743' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:58.874987+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:59.875143+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:00.875281+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:01.875403+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:02.875546+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:03.875635+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:04.875792+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:05.875949+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:06.878408+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:07.878534+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:08.878718+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78635008 unmapped: 15745024 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:09.878852+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'config show' '{prefix=config show}'
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 15548416 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:10.878996+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:42 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:42 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 15343616 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:34:42 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:11.879141+0000)
Dec 03 21:34:42 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 15343616 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:42 compute-0 ceph-osd[87094]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:34:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 03 21:34:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3308223750' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 03 21:34:42 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:34:43 compute-0 rsyslogd[1006]: imjournal from <np0005544708:ceph-osd>: begin to drop messages due to rate-limiting
Dec 03 21:34:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 03 21:34:43 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3652548116' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 03 21:34:43 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2628822250' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 03 21:34:43 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1896228232' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 03 21:34:43 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3063975184' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: pgmap v891: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3308223750' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3652548116' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2628822250' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1896228232' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 03 21:34:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3063975184' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 03 21:34:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 03 21:34:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3882028722' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 03 21:34:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 03 21:34:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/73435469' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 03 21:34:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v892: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 03 21:34:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1256326152' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 03 21:34:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 03 21:34:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3304457426' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 03 21:34:44 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3882028722' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 03 21:34:44 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/73435469' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 03 21:34:44 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1256326152' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 03 21:34:44 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3304457426' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 03 21:34:45 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3136263609' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 03 21:34:45 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1985349043' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 03 21:34:45 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3652806554' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 03 21:34:45 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2364751280' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: pgmap v892: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:45 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3136263609' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1985349043' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3652806554' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:34:45 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2364751280' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 03 21:34:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 03 21:34:46 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1018667953' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 03 21:34:46 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14818 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v893: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:46 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14820 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:46 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14822 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:46 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1018667953' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 03 21:34:46 compute-0 ceph-mon[75204]: from='client.14818 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:47 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14826 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:47 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14824 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888262749s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578292847s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.100525 8 0.000116
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.111871 11 0.000102
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.110261 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888237953s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578292847s@ mbc={}] exit Reset 0.000050 1 0.000096
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899343491s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589401245s@ mbc={}] exit Reset 0.000292 1 0.000330
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888237953s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578292847s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.110349 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899343491s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589401245s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888237953s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578292847s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899343491s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589401245s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888237953s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578292847s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899343491s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589401245s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888237953s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578292847s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899343491s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589401245s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.111890 11 0.000097
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 9.110396 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.888237953s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578292847s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122515 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899343491s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589401245s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.756535 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.756654 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899204254s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589408875s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.100710 8 0.000105
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.110588 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.110683 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887511253s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577743530s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 9.110725 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887497902s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577743530s@ mbc={}] exit Reset 0.000034 1 0.000153
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899168015s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589408875s@ mbc={}] exit Reset 0.000143 1 0.000205
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887497902s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577743530s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887497902s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577743530s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887497902s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577743530s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899168015s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589408875s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887497902s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577743530s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887497902s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577743530s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899168015s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589408875s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899168015s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589408875s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899168015s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589408875s@ mbc={}] exit Start 0.000014 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899053574s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589332581s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899168015s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589408875s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899025917s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589332581s@ mbc={}] exit Reset 0.000059 1 0.000092
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899025917s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589332581s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899025917s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589332581s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.100712 8 0.000055
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.110441 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899025917s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589332581s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.110569 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899025917s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589332581s@ mbc={}] exit Start 0.000035 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 9.110612 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899025917s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589332581s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899174690s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589561462s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899160385s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589561462s@ mbc={}] exit Reset 0.000028 1 0.000069
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899160385s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589561462s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899160385s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589561462s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112164 11 0.000064
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899160385s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589561462s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899160385s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589561462s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122755 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.899160385s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589561462s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.754920 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.754959 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112221 11 0.000076
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122793 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.757399 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.757431 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887281418s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577758789s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887357712s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577850342s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122548 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.756764 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887332916s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577850342s@ mbc={}] exit Reset 0.000049 1 0.000081
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.756815 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887332916s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577850342s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887332916s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577850342s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887244225s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577758789s@ mbc={}] exit Reset 0.000093 1 0.000115
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887332916s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577850342s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887207985s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.577751160s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887244225s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577758789s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887332916s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577850342s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887244225s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577758789s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887332916s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577850342s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887244225s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577758789s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887195587s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577751160s@ mbc={}] exit Reset 0.000030 1 0.000516
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887244225s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577758789s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887244225s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577758789s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887195587s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577751160s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887195587s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577751160s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887195587s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577751160s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887195587s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577751160s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887195587s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.577751160s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.100998 8 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.110592 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.110689 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 9.110721 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112374 11 0.000083
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122899 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.756708 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898865700s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589576721s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.756736 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898849487s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] exit Reset 0.000029 1 0.000050
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898849487s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898849487s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898849487s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898849487s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898849487s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887292862s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578041077s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887272835s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578041077s@ mbc={}] exit Reset 0.000042 1 0.000076
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887272835s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578041077s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887272835s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578041077s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887272835s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578041077s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887272835s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578041077s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887272835s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578041077s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112481 11 0.000153
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.123005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.755863 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.755886 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887163162s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578002930s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 9.101192 8 0.000068
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 9.110658 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887149811s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578002930s@ mbc={}] exit Reset 0.000027 1 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 9.110775 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887149811s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578002930s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887149811s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578002930s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887149811s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578002930s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 9.110810 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887149811s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578002930s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887149811s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578002930s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898716927s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 active pruub 102.589614868s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112520 11 0.000069
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122365 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.756037 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.756069 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898696899s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589614868s@ mbc={}] exit Reset 0.000046 1 0.000071
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898696899s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589614868s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898696899s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589614868s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887128830s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578063965s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898696899s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589614868s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898696899s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589614868s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887117386s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578063965s@ mbc={}] exit Reset 0.000025 1 0.000043
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887117386s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578063965s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41 pruub=14.898696899s) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589614868s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887117386s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578063965s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887117386s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578063965s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887117386s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578063965s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887117386s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578063965s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112543 11 0.000056
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122343 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.755711 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.755744 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112518 11 0.000062
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122135 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.756002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.756028 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887083054s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578102112s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887088776s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578125000s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112540 11 0.000056
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122189 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887045860s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578102112s@ mbc={}] exit Reset 0.000066 1 0.000100
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112542 11 0.000053
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122139 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.755942 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887045860s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578102112s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887045860s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578102112s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887045860s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578102112s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.755972 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887045860s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578102112s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887045860s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578102112s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887156487s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578285217s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887145042s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] exit Reset 0.000023 1 0.000048
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887145042s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887145042s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887145042s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887145042s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.887145042s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112757 11 0.000054
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122263 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.756080 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.756118 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886906624s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578125000s@ mbc={}] exit Reset 0.000249 1 0.000198
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886906624s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578125000s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886906624s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578125000s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886906624s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578125000s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886906624s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578125000s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886906624s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578125000s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886973381s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578285217s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886903763s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] exit Reset 0.000135 1 0.000178
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 10.112807 11 0.000056
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 10.122319 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.757213 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary 10.756050 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.757247 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886903763s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886903763s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] exit Started 10.756140 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886903763s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=35) [0] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886874199s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578308105s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886903763s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886903763s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578285217s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886862755s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578308105s@ mbc={}] exit Reset 0.000030 1 0.000050
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886862755s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578308105s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886862755s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578308105s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886862755s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578308105s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886862755s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578308105s@ mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886862755s) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578308105s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886787415s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 active pruub 101.578254700s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886762619s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578254700s@ mbc={}] exit Reset 0.000057 1 0.000450
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886762619s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578254700s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886762619s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578254700s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886762619s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578254700s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886762619s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578254700s@ mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41 pruub=13.886762619s) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY pruub 101.578254700s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 41 handle_osd_map epochs [41,41], i have 41, src has [1,41]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 41 handle_osd_map epochs [41,41], i have 41, src has [1,41]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000056 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000013
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000054 1 0.000028
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000008
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000449 1 0.000233
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000034 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000009
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000041 1 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000036 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000005 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000031 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000027 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000006
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000031 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000026 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000027 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000029 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000008 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000026 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000041 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000042 1 0.000023
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000033 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000036 1 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000031 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000043 1 0.000023
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000050 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000037 1 0.000035
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000030 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000026 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000030 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000034 1 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000041 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000016 1 0.000019
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000032 1 0.000023
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000026 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000039 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000051 1 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000026 1 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000059 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000037 1 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000107 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000027 1 0.000050
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000120 1 0.000070
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000054 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000019
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000083 1 0.000041
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000049 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000018
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000076 1 0.000062
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000054 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000074 1 0.000063
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000046 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000010
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000060 1 0.000027
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000028 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=0 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000028 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=0/0 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000027 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000012
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000031 1 0.000023
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000025 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000107 1 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000035 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000007
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000112 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000028 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000004 1 0.000013
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000042 1 0.000068
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000100 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000027 1 0.000035
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000118 1 0.000039
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000058 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000014
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000078 1 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000071 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000024
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000069 1 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000091 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000011 1 0.000026
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000085 1 0.000060
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000095 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000024
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000076 1 0.000052
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000061 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000012 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000050 1 0.000046
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000050 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000044 1 0.000034
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000049 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000070 1 0.000035
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000059 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=0 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000012 1 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000043 1 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=0/0 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000194 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000010 1 0.000026
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000064 1 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000057 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000007 1 0.000017
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000064 1 0.000034
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000054 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=0 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000006 1 0.000010
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000049 1 0.000025
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=0/0 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000043 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=0 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000013
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000036 1 0.000024
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=0/0 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010454 2 0.000018
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010310 2 0.000015
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010268 2 0.000017
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010011 2 0.000017
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009969 2 0.000031
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009866 2 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009763 2 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011003 2 0.000015
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009788 2 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008816 2 0.000040
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008144 2 0.000038
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008735 2 0.000047
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008422 2 0.000034
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008012 2 0.000019
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000093 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008035 2 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005689 2 0.000035
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005408 2 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005627 2 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005202 2 0.000032
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.005049 2 0.000031
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000134 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014337 2 0.000015
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014184 2 0.000028
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015864 2 0.000039
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000056 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.015377 2 0.000037
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013064 2 0.000015
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000027 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013306 2 0.000017
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.013146 2 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011435 2 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000029 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011446 2 0.000020
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011061 2 0.000020
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.011263 2 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000045 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000046 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014011 2 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010593 2 0.000050
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000019 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.009896 2 0.000041
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010322 2 0.000049
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008442 2 0.000044
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.014930 2 0.000016
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007943 2 0.000018
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.008345 2 0.000072
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000002 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000028 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000027 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.010972 2 0.000038
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.007886 2 0.000017
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 41 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:07.969852+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63012864 unmapped: 942080 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 41 handle_osd_map epochs [41,42], i have 41, src has [1,42]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 41 handle_osd_map epochs [41,42], i have 42, src has [1,42]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.982087 2 0.000032
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.982228 2 0.000085
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.990180 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987462 2 0.000051
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997417 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983740 2 0.000236
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987887 2 0.000028
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997958 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.998116 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987860 2 0.000020
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997898 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987644 2 0.000020
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997465 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984162 2 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.998553 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.990946 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984388 2 0.000054
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984568 2 0.000025
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.000525 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.000278 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983321 2 0.000036
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.994414 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987648 2 0.000025
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.993574 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.987619 2 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.983861 2 0.000037
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.994301 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984835 2 0.000042
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997984 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.993478 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984498 2 0.000098
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.995704 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.984228 2 0.000232
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989647 2 0.000035
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.995743 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988102 2 0.000025
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.993559 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.000352 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985134 2 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.996553 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988468 2 0.000082
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997052 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989026 2 0.000279
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989194 2 0.000030
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.998054 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997793 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989514 2 0.000019
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999358 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988683 2 0.000198
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.994456 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985735 2 0.000037
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999264 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989670 2 0.000088
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.988353 2 0.000039
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.998072 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997859 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985753 2 0.000040
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999819 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985365 2 0.000028
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.993831 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.989403 2 0.000153
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985928 2 0.000206
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997603 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 42 handle_osd_map epochs [42,42], i have 42, src has [1,42]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.990229 2 0.000020
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999221 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986530 2 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.999908 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986318 2 0.000166
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.997893 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985827 2 0.000023
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.994378 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.986057 2 0.000024
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.996089 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=39/40 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.990690 2 0.000015
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.001746 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.985568 2 0.000312
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.000861 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=35/36 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.991578 2 0.000052
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.995388 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.002119 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=37/38 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.991598 2 0.000025
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.018381 7 0.000059
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.002009 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=33/34 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004121 3 0.000250
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004007 3 0.000071
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.13( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.003982 3 0.000039
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.14( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004080 3 0.000093
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.12( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.004026 3 0.000050
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.15( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.020279 7 0.000076
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.020544 7 0.000048
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.005370 3 0.000162
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000017 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.11( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.021546 7 0.000084
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.023436 7 0.000116
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.022370 7 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 42 handle_osd_map epochs [42,42], i have 42, src has [1,42]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011170 4 0.000077
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.15( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010755 4 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000032 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.17( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011865 4 0.001562
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000029 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1b( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011656 4 0.000100
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.9( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011611 4 0.000198
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011546 4 0.000046
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011348 4 0.000329
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.b( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.8( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.a( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011205 4 0.000056
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.6( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010829 4 0.000690
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.3( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010807 4 0.000399
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011585 4 0.000097
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010824 4 0.000454
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010791 4 0.000069
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.3( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011975 4 0.000721
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.13( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.16( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011009 4 0.000053
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.f( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011353 4 0.000114
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.5( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=39/39 les/c/f=40/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=35/35 les/c/f=36/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=37/37 les/c/f=38/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=33/33 les/c/f=34/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011523 4 0.000162
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011056 4 0.000395
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000014 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.6( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.2( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011164 4 0.000446
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1c( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011034 4 0.001455
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011045 4 0.000057
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.3( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.9( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011066 4 0.000236
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000010 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010998 4 0.000047
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.4( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.18( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011013 4 0.000039
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010930 4 0.000034
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.c( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010956 4 0.000102
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010837 4 0.000112
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.7( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.4( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010863 4 0.000073
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1b( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010808 4 0.000078
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.011898 4 0.000955
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.2( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[7.1f( empty local-lis/les=41/42 n=0 ec=39/23 lis/c=41/39 les/c/f=42/40/0 sis=41) [0] r=0 lpr=41 pi=[39,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010836 4 0.000204
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.1( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[3.f( empty local-lis/les=41/42 n=0 ec=35/17 lis/c=41/35 les/c/f=42/36/0 sis=41) [0] r=0 lpr=41 pi=[35,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010792 4 0.000060
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010759 4 0.000194
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000003 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000011 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.18( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[5.1e( empty local-lis/les=41/42 n=0 ec=37/19 lis/c=41/37 les/c/f=42/38/0 sis=41) [0] r=0 lpr=41 pi=[37,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010815 4 0.000554
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.1d( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.010807 4 0.000127
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000012 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[2.19( empty local-lis/les=41/42 n=0 ec=33/16 lis/c=41/33 les/c/f=42/34/0 sis=41) [0] r=0 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.029932 7 0.000263
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000086 1 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.032966 7 0.000059
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000119 1 0.000216
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.033959 7 0.000275
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.033044 7 0.000090
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.033028 7 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000181 1 0.000065
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.034077 7 0.000077
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.033681 7 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000146 1 0.000057
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.035528 7 0.000044
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.035219 7 0.000052
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.034152 7 0.000045
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.033630 7 0.000086
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000203 1 0.000087
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000178 1 0.000027
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000234 1 0.000022
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.034667 7 0.000043
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000224 1 0.000031
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000243 1 0.000013
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000264 1 0.000014
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000431 1 0.000183
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000301 1 0.000184
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.036477 7 0.000056
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.036784 7 0.000050
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.035077 7 0.000121
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.035271 7 0.000114
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.037622 7 0.000051
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.035737 7 0.000226
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.036060 7 0.000071
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000482 1 0.000046
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000520 1 0.000017
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000608 1 0.000015
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000664 1 0.000026
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000714 1 0.000014
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000759 1 0.000021
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000863 1 0.000018
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.042594 7 0.000060
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.042720 7 0.000061
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000379 1 0.000061
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000357 1 0.000036
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.042700 7 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000107 1 0.000702
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.063971 1 0.000051
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.064102 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.d( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.094066 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.077455 2 0.000063
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.077480 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000061 1 0.000067
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.068630 1 0.000069
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.068823 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1a( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.101955 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.074823 1 0.000042
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.075034 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.1( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started 1.109089 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.082293 1 0.000073
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.082481 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.14( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.115610 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.089876 1 0.000026
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.090124 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.12( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.123195 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.096947 1 0.000018
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.097157 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.2( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.131272 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.104224 1 0.000019
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.104499 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.f( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.138204 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.111509 1 0.000024
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.111771 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.9( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started 1.147022 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.118978 1 0.000014
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.119252 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.9( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.153434 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.126210 1 0.000064
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.126508 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.10( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.160201 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.133906 1 0.000073
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.134389 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.8( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.169947 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.140948 1 0.000030
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.141292 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.4( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.175988 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.146489 1 0.000053
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.147001 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1b( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.183512 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.153800 1 0.000084
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.154376 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.a( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.191195 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.161105 1 0.000043
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.161743 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.18( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.196841 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.168428 1 0.000051
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.169137 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.11( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.204429 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.176218 1 0.000092
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.176980 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1c( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.214635 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.183365 1 0.000104
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.184192 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.e( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.219965 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.190484 1 0.000020
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.191389 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.1( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.227485 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.192329 1 0.000063
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.192737 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.5( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.235381 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.199747 1 0.000049
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.200145 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.7( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [1] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.242908 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 DELETING pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.205425 1 0.000058
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.205919 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[4.13( empty lb MIN local-lis/les=35/37 n=0 ec=35/18 lis/c=35/35 les/c/f=37/37/0 sis=41) [2] r=-1 lpr=41 pi=[35,41)/1 crt=0'0 unknown NOTIFY mbc={}] exit Started 1.248958 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.299565 2 0.000042
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.299598 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000071 1 0.000065
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:08.970021+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.294233 2 0.000269
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.294342 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.390276 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.145011 2 0.000118
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.145173 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.465122 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.514892 2 0.000036
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.514964 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000203 1 0.000173
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.572682 2 0.000473
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.572765 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000162 1 0.000105
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.136237 2 0.000303
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.136682 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.672251 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.165825 2 0.000176
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.166067 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.7( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.760833 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.747952 2 0.000047
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.748033 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000173 1 0.000217
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63332352 unmapped: 622592 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.861928 2 0.000028
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.861984 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000167 1 0.000103
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.118051 2 0.000247
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.118321 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.5( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.889884 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 DELETING pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.024713 2 0.000163
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.024950 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 42 pg[6.3( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=41) [1] r=-1 lpr=41 pi=[37,41)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.909346 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 42 heartbeat osd_stat(store_statfs(0x4fe155000/0x0/0x4ffc00000, data 0x2ce29/0x73000, compress 0x0/0x0/0x0, omap 0x7750, meta 0x1a288b0), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 42 handle_osd_map epochs [43,43], i have 42, src has [1,43]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 42 handle_osd_map epochs [43,43], i have 43, src has [1,43]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.240471 14 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.250118 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.250217 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 11.250262 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.240549 14 0.000078
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.250235 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.250335 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 11.250377 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759346962s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.589576721s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759297371s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] exit Reset 0.000102 1 0.000169
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759297371s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759279251s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.589569092s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759297371s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759297371s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759297371s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] exit Start 0.000014 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759297371s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759241104s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589569092s@ mbc={}] exit Reset 0.000080 1 0.000141
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759241104s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589569092s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759241104s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589569092s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759241104s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589569092s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759241104s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589569092s@ mbc={}] exit Start 0.000009 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759241104s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589569092s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.240848 14 0.000057
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.250622 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.250781 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 11.250838 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.759024620s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.589576721s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 11.247086 14 0.000111
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 11.251487 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 11.251875 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 11.251911 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.758980751s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] exit Reset 0.000155 1 0.000150
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.753028870s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 active pruub 102.583694458s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.758980751s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.758980751s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.758980751s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.758980751s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] exit Start 0.000018 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.758980751s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.589576721s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.752966881s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583694458s@ mbc={}] exit Reset 0.000101 1 0.000140
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.752966881s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583694458s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.752966881s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583694458s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.752966881s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583694458s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.752966881s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583694458s@ mbc={}] exit Start 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 43 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43 pruub=12.752966881s) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 102.583694458s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:09.970294+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 346283 data_alloc: 218103808 data_used: 0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63340544 unmapped: 614400 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 43 handle_osd_map epochs [43,44], i have 43, src has [1,44]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 44 handle_osd_map epochs [44,44], i have 44, src has [1,44]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 0.927665 7 0.000162
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 0.928352 7 0.000309
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 0.934677 7 0.000124
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000180 1 0.000169
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 0.936866 7 0.000117
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000160 1 0.000201
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 DELETING pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.004349 1 0.000080
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.004647 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.a( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started 0.939437 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.013320 2 0.000085
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.013400 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000093 1 0.000145
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 DELETING pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.068389 1 0.000068
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.068666 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.2( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started 1.005692 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.080754 2 0.000083
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.080792 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000127 1 0.000076
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 DELETING pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.086256 2 0.000173
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.086419 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.6( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.027580 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 DELETING pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.025930 2 0.000196
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.026140 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 44 pg[6.e( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=43) [1] r=-1 lpr=43 pi=[37,43)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.035365 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:10.970446+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 15 sent 13 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:40.334065+0000 osd.0 (osd.0) 14 : cluster [DBG] 4.b scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:40.344634+0000 osd.0 (osd.0) 15 : cluster [DBG] 4.b scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63356928 unmapped: 598016 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 15)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:40.334065+0000 osd.0 (osd.0) 14 : cluster [DBG] 4.b scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:40.344634+0000 osd.0 (osd.0) 15 : cluster [DBG] 4.b scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:11.970635+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63406080 unmapped: 548864 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:12.970782+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 17 sent 15 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:42.388961+0000 osd.0 (osd.0) 16 : cluster [DBG] 4.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:42.399727+0000 osd.0 (osd.0) 17 : cluster [DBG] 4.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63406080 unmapped: 548864 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 17)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:42.388961+0000 osd.0 (osd.0) 16 : cluster [DBG] 4.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:42.399727+0000 osd.0 (osd.0) 17 : cluster [DBG] 4.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 44 heartbeat osd_stat(store_statfs(0x4fe151000/0x0/0x4ffc00000, data 0x2f843/0x77000, compress 0x0/0x0/0x0, omap 0x8558, meta 0x1a27aa8), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:13.971014+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63414272 unmapped: 540672 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:14.971175+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 345577 data_alloc: 218103808 data_used: 0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.764271736s of 10.987809181s, submitted: 361
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63430656 unmapped: 524288 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:15.971373+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 19 sent 17 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:45.331629+0000 osd.0 (osd.0) 18 : cluster [DBG] 4.19 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:45.342117+0000 osd.0 (osd.0) 19 : cluster [DBG] 4.19 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 19)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:45.331629+0000 osd.0 (osd.0) 18 : cluster [DBG] 4.19 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:45.342117+0000 osd.0 (osd.0) 19 : cluster [DBG] 4.19 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:16.971645+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 21 sent 19 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:46.372642+0000 osd.0 (osd.0) 20 : cluster [DBG] 4.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:46.383201+0000 osd.0 (osd.0) 21 : cluster [DBG] 4.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 44 handle_osd_map epochs [45,45], i have 44, src has [1,45]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 21)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:46.372642+0000 osd.0 (osd.0) 20 : cluster [DBG] 4.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:46.383201+0000 osd.0 (osd.0) 21 : cluster [DBG] 4.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000091 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000027
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000145 1 0.000078
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000098 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000030 1 0.000056
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000077 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000109 1 0.000381
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000111 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000027 1 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000032 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000226 1 0.000116
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000063 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=0 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000029 1 0.000052
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000075 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000098 1 0.000256
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.002565 2 0.000058
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetLog 0.001532 2 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetMissing 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/GetLog 0.001031 2 0.000071
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetLog 0.002414 2 0.000107
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/GetMissing 0.000019 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 45 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:17.971906+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 23 sent 21 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:47.415736+0000 osd.0 (osd.0) 22 : cluster [DBG] 4.0 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:47.426384+0000 osd.0 (osd.0) 23 : cluster [DBG] 4.0 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.c scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.c scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63488000 unmapped: 1515520 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 23)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:47.415736+0000 osd.0 (osd.0) 22 : cluster [DBG] 4.0 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:47.426384+0000 osd.0 (osd.0) 23 : cluster [DBG] 4.0 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 45 handle_osd_map epochs [45,46], i have 45, src has [1,46]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 45 handle_osd_map epochs [46,46], i have 46, src has [1,46]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.013252 2 0.000151
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=3 mbc={}] exit Started/Primary/Peering 1.014527 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 unknown m=3 mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.013133 2 0.000362
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 1.015909 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.013338 2 0.000502
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering 1.015209 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 unknown m=2 mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 activating+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.014109 2 0.000088
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 peering m=1 mbc={}] exit Started/Primary/Peering 1.016874 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 unknown m=1 mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.001678 4 0.000200
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/Activating 0.001995 4 0.000364
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000160 1 0.000277
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000019 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/Activating 0.004068 5 0.000087
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.004439 5 0.000253
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.007908 2 0.000101
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.008132 2 0.000039
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000015 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 lc 31'1 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:18.972089+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 25 sent 23 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:48.376848+0000 osd.0 (osd.0) 24 : cluster [DBG] 4.c scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:48.387400+0000 osd.0 (osd.0) 25 : cluster [DBG] 4.c scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.193705 1 0.000101
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.199582 1 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000005 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 lc 31'21 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000080 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.058101 1 0.000070
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000020 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.7( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.257687 1 0.000033
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000064 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.155542 1 0.000207
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000018 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 46 pg[6.3( v 31'39 (0'0,31'39] local-lis/les=45/46 n=2 ec=37/21 lis/c=45/41 les/c/f=46/42/0 sis=45) [0] r=0 lpr=45 pi=[41,45)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 1400832 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 46 heartbeat osd_stat(store_statfs(0x4fe14a000/0x0/0x4ffc00000, data 0x32561/0x80000, compress 0x0/0x0/0x0, omap 0xa42c, meta 0x1a25bd4), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 25)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:48.376848+0000 osd.0 (osd.0) 24 : cluster [DBG] 4.c scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:48.387400+0000 osd.0 (osd.0) 25 : cluster [DBG] 4.c scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:19.972269+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 377187 data_alloc: 218103808 data_used: 0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.17 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.17 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 1400832 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:20.972407+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 27 sent 25 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:50.378452+0000 osd.0 (osd.0) 26 : cluster [DBG] 4.17 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:50.389064+0000 osd.0 (osd.0) 27 : cluster [DBG] 4.17 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 1400832 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 46 handle_osd_map epochs [47,48], i have 46, src has [1,48]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 27)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:50.378452+0000 osd.0 (osd.0) 26 : cluster [DBG] 4.17 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:50.389064+0000 osd.0 (osd.0) 27 : cluster [DBG] 4.17 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=0 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000109 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=0 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000017 1 0.000035
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000175 1 0.000053
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=0 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000386 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=0 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000046 1 0.000140
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000156 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000202 1 0.000392
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 23.287009 27 0.000166
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 23.296512 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 23.296699 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 23.296812 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712948799s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 active pruub 110.589859009s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712782860s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.589859009s@ mbc={}] exit Reset 0.000219 2 0.000352
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712782860s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.589859009s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712782860s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.589859009s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712782860s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.589859009s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712782860s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.589859009s@ mbc={}] exit Start 0.000084 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.712782860s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.589859009s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetLog 0.002589 2 0.000485
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetMissing 0.000008 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 23.294264 27 0.000169
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 23.298591 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 23.298683 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 23.298770 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 47 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.706089020s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 active pruub 110.583999634s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetLog 0.001874 2 0.000827
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/GetMissing 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.705857277s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.583999634s@ mbc={}] exit Reset 0.000285 2 0.000368
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.705857277s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.583999634s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.705857277s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.583999634s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.705857277s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.583999634s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.705857277s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.583999634s@ mbc={}] exit Start 0.000041 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 48 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47 pruub=8.705857277s) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 110.583999634s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:21.972690+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63676416 unmapped: 1327104 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 48 handle_osd_map epochs [48,49], i have 48, src has [1,49]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.004350 2 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering 1.006651 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 unknown m=2 mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 49 handle_osd_map epochs [48,49], i have 49, src has [1,49]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering/WaitUpThru 1.005753 2 0.000085
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 peering m=2 mbc={}] exit Started/Primary/Peering 1.008662 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 0'0 unknown m=2 mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.003282 3 0.000640
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.000377 1 0.000060
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000022 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 lc 31'13 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/Activating 0.004180 3 0.000229
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitLocalRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.009720 7 0.000262
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.010798 7 0.000243
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 49 handle_osd_map epochs [49,49], i have 49, src has [1,49]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.079668 3 0.000141
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000018 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitLocalRecoveryReserved 0.078710 3 0.000091
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/WaitRemoteRecoveryReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] exit Started/Primary/Active/WaitRemoteRecoveryReserved 0.000009 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 lc 31'11 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] enter Started/Primary/Active/Recovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.078761 2 0.000075
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.078802 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:22.972853+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovering 0.139858 1 0.000092
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active/Recovered 0.000015 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.5( v 31'39 (0'0,31'39] local-lis/les=48/49 n=2 ec=37/21 lis/c=48/41 les/c/f=49/42/0 sis=48) [0] r=0 lpr=48 pi=[41,48)/1 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.139545 1 0.000229
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 DELETING pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.133996 2 0.000399
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.273680 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.c( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.363567 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.354986 2 0.000068
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ReplicaActive 0.355029 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000137 1 0.000087
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 DELETING pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete/Deleting 0.034134 2 0.000169
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started/ToDelete 0.034353 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 49 pg[6.4( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=2 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=47) [1] r=-1 lpr=47 pi=[37,47)/1 pct=0'0 crt=31'39 lcod 0'0 active mbc={}] exit Started 1.399227 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 49 heartbeat osd_stat(store_statfs(0x4fe144000/0x0/0x4ffc00000, data 0x3518d/0x86000, compress 0x0/0x0/0x0, omap 0xa6de, meta 0x1a25922), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63684608 unmapped: 1318912 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 49 handle_osd_map epochs [49,50], i have 49, src has [1,50]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:23.973038+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63692800 unmapped: 1310720 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:24.973193+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 390366 data_alloc: 218103808 data_used: 0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63709184 unmapped: 1294336 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 50 handle_osd_map epochs [50,51], i have 50, src has [1,51]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.419502258s of 10.535414696s, submitted: 68
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:25.973372+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63766528 unmapped: 1236992 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:26.973606+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 29 sent 27 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:56.231243+0000 osd.0 (osd.0) 28 : cluster [DBG] 4.16 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:56.241823+0000 osd.0 (osd.0) 29 : cluster [DBG] 4.16 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63766528 unmapped: 1236992 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 29)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:56.231243+0000 osd.0 (osd.0) 28 : cluster [DBG] 4.16 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:56.241823+0000 osd.0 (osd.0) 29 : cluster [DBG] 4.16 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:27.973845+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63791104 unmapped: 1212416 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:28.974057+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 31 sent 29 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:58.268201+0000 osd.0 (osd.0) 30 : cluster [DBG] 4.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:10:58.278889+0000 osd.0 (osd.0) 31 : cluster [DBG] 4.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 51 heartbeat osd_stat(store_statfs(0x4fe13c000/0x0/0x4ffc00000, data 0x38c8b/0x8e000, compress 0x0/0x0/0x0, omap 0xb292, meta 0x1a24d6e), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 51 handle_osd_map epochs [52,52], i have 51, src has [1,52]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active+clean] exit Started/Primary/Active/Clean 30.644972 41 0.000145
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active 30.654934 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary 30.655271 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started 30.655312 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=37) [0] r=0 lpr=37 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355226517s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 active pruub 118.589836121s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355180740s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.589836121s@ mbc={}] exit Reset 0.000124 1 0.000191
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355180740s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.589836121s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355180740s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.589836121s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355180740s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.589836121s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355180740s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.589836121s@ mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 52 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52 pruub=9.355180740s) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY pruub 118.589836121s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 63815680 unmapped: 1187840 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 31)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:58.268201+0000 osd.0 (osd.0) 30 : cluster [DBG] 4.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:10:58.278889+0000 osd.0 (osd.0) 31 : cluster [DBG] 4.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:29.974273+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 52 handle_osd_map epochs [53,53], i have 52, src has [1,53]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/Stray 1.128307 6 0.000083
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000827 2 0.000061
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 DELETING pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete/Deleting 0.002101 1 0.000039
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started/ToDelete 0.002995 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.8( v 31'39 (0'0,31'39] lb MIN local-lis/les=37/38 n=1 ec=37/21 lis/c=37/37 les/c/f=38/38/0 sis=52) [2] r=-1 lpr=52 pi=[37,52)/1 crt=31'39 lcod 0'0 unknown NOTIFY mbc={}] exit Started 1.131368 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 401042 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 64897024 unmapped: 106496 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:30.974434+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 64937984 unmapped: 65536 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=0 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000095 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=0 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000009 1 0.000017
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000141 1 0.000053
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000281 2 0.000056
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000006 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 53 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 53 handle_osd_map epochs [54,54], i have 53, src has [1,54]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 53 handle_osd_map epochs [54,54], i have 54, src has [1,54]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a(unlocked)] enter Initial
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=0 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Initial 0.000117 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=0 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Reset 0.000013 1 0.000031
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] exit Start 0.000013 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] enter Started/Primary/Peering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetInfo
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetInfo 0.000381 1 0.000061
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( empty local-lis/les=0/0 n=0 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=0'0 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetLog
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 0.391484 2 0.000088
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 0.391972 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=41/42 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: merge_log_dups log.dups.size()=0olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: end of merge_log_dups changed=0 log.dups.size()=0 olog.dups.size()=0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetLog 0.000455 2 0.000058
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/GetMissing
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/GetMissing 0.000004 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 mlcod 0'0 peering mbc={}] enter Started/Primary/Peering/WaitUpThru
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=37/21 lis/c=41/41 les/c/f=42/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=37/21 lis/c=53/41 les/c/f=54/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.003357 4 0.000135
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=37/21 lis/c=53/41 les/c/f=54/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=37/21 lis/c=53/41 les/c/f=54/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000007 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 54 pg[6.9( v 31'39 (0'0,31'39] local-lis/les=53/54 n=1 ec=37/21 lis/c=53/41 les/c/f=54/42/0 sis=53) [0] r=0 lpr=53 pi=[41,53)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:31.974623+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 33 sent 31 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:01.288792+0000 osd.0 (osd.0) 32 : cluster [DBG] 2.13 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:01.299328+0000 osd.0 (osd.0) 33 : cluster [DBG] 2.13 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 64970752 unmapped: 32768 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 54 handle_osd_map epochs [55,55], i have 54, src has [1,55]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering/WaitUpThru 1.009906 2 0.000040
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 mlcod 0'0 peering mbc={}] exit Started/Primary/Peering 1.010814 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 lc 0'0 (0'0,31'39] local-lis/les=43/44 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 mlcod 0'0 unknown mbc={}] enter Started/Primary/Active
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=54/55 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 activating mbc={}] enter Started/Primary/Active/Activating
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 33)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:01.288792+0000 osd.0 (osd.0) 32 : cluster [DBG] 2.13 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:01.299328+0000 osd.0 (osd.0) 33 : cluster [DBG] 2.13 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=54/55 n=1 ec=37/21 lis/c=43/43 les/c/f=44/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=54/55 n=1 ec=37/21 lis/c=54/43 les/c/f=55/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Activating 0.002750 3 0.000225
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=54/55 n=1 ec=37/21 lis/c=54/43 les/c/f=55/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Recovered
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=54/55 n=1 ec=37/21 lis/c=54/43 les/c/f=55/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] exit Started/Primary/Active/Recovered 0.000015 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 55 pg[6.a( v 31'39 (0'0,31'39] local-lis/les=54/55 n=1 ec=37/21 lis/c=54/43 les/c/f=55/44/0 sis=54) [0] r=0 lpr=54 pi=[43,54)/1 crt=31'39 lcod 0'0 mlcod 0'0 active mbc={}] enter Started/Primary/Active/Clean
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:32.974792+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 35 sent 33 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:02.246323+0000 osd.0 (osd.0) 34 : cluster [DBG] 5.14 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:02.256901+0000 osd.0 (osd.0) 35 : cluster [DBG] 5.14 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 55 handle_osd_map epochs [55,55], i have 55, src has [1,55]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 64987136 unmapped: 16384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:33.975146+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 35)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:02.246323+0000 osd.0 (osd.0) 34 : cluster [DBG] 5.14 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:02.256901+0000 osd.0 (osd.0) 35 : cluster [DBG] 5.14 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 55 heartbeat osd_stat(store_statfs(0x4fe12e000/0x0/0x4ffc00000, data 0x3e35d/0x9a000, compress 0x0/0x0/0x0, omap 0xbcb0, meta 0x1a24350), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 55 handle_osd_map epochs [56,56], i have 55, src has [1,56]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 15.680893 27 0.000118
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 15.691074 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 16.707129 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 16.707305 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310461044s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 active pruub 122.845947266s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310279846s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY pruub 122.845947266s@ mbc={}] exit Reset 0.000479 1 0.000669
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310279846s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY pruub 122.845947266s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310279846s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY pruub 122.845947266s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310279846s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY pruub 122.845947266s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310279846s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY pruub 122.845947266s@ mbc={}] exit Start 0.000095 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 56 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56 pruub=8.310279846s) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY pruub 122.845947266s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 56 handle_osd_map epochs [56,56], i have 56, src has [1,56]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65019904 unmapped: 1032192 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:34.975347+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 56 handle_osd_map epochs [57,57], i have 56, src has [1,57]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 0.534277 6 0.000332
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.009680 3 0.000264
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.009800 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000137 1 0.000106
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 DELETING pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.010267 2 0.000375
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.010520 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 57 pg[6.b( v 31'39 (0'0,31'39] lb MIN local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=56) [1] r=-1 lpr=56 pi=[45,56)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 0.554877 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 416848 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:35.975561+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 57 handle_osd_map epochs [57,58], i have 57, src has [1,58]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.059386253s of 10.144032478s, submitted: 35
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65077248 unmapped: 974848 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:36.975779+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:37.975946+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65126400 unmapped: 925696 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:38.976167+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 37 sent 35 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:08.424360+0000 osd.0 (osd.0) 36 : cluster [DBG] 3.12 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:08.434891+0000 osd.0 (osd.0) 37 : cluster [DBG] 3.12 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 37)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:08.424360+0000 osd.0 (osd.0) 36 : cluster [DBG] 3.12 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:08.434891+0000 osd.0 (osd.0) 37 : cluster [DBG] 3.12 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 58 heartbeat osd_stat(store_statfs(0x4fe123000/0x0/0x4ffc00000, data 0x42409/0xa3000, compress 0x0/0x0/0x0, omap 0xc49b, meta 0x1a23b65), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 58 handle_osd_map epochs [59,59], i have 58, src has [1,59]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=48) [0] r=0 lpr=48 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 16.265585 30 0.000325
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=48) [0] r=0 lpr=48 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 16.349229 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=48) [0] r=0 lpr=48 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 17.355977 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=48) [0] r=0 lpr=48 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 17.356235 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=48) [0] r=0 lpr=48 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654327393s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 active pruub 134.886367798s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654233932s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY pruub 134.886367798s@ mbc={}] exit Reset 0.000189 1 0.000291
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654233932s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY pruub 134.886367798s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654233932s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY pruub 134.886367798s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654233932s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY pruub 134.886367798s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654233932s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY pruub 134.886367798s@ mbc={}] exit Start 0.000010 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 59 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59 pruub=15.654233932s) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY pruub 134.886367798s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 59 handle_osd_map epochs [59,59], i have 59, src has [1,59]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65159168 unmapped: 892928 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:39.976412+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 39 sent 37 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:09.447799+0000 osd.0 (osd.0) 38 : cluster [DBG] 5.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:09.458190+0000 osd.0 (osd.0) 39 : cluster [DBG] 5.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 39)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:09.447799+0000 osd.0 (osd.0) 38 : cluster [DBG] 5.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:09.458190+0000 osd.0 (osd.0) 39 : cluster [DBG] 5.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 59 handle_osd_map epochs [60,60], i have 59, src has [1,60]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 0.915390 6 0.000095
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.069548 3 0.000078
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.069595 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000157 1 0.000100
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 DELETING pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.017862 2 0.000254
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.018097 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 60 pg[6.d( v 31'39 (0'0,31'39] lb MIN local-lis/les=48/49 n=1 ec=37/21 lis/c=48/48 les/c/f=49/49/0 sis=59) [1] r=-1 lpr=59 pi=[48,59)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 1.003152 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 424978 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:40.976695+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 41 sent 39 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:10.440445+0000 osd.0 (osd.0) 40 : cluster [DBG] 2.11 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:10.450739+0000 osd.0 (osd.0) 41 : cluster [DBG] 2.11 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 41)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:10.440445+0000 osd.0 (osd.0) 40 : cluster [DBG] 2.11 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:10.450739+0000 osd.0 (osd.0) 41 : cluster [DBG] 2.11 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:42.012615+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:43.012818+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 60 handle_osd_map epochs [61,61], i have 60, src has [1,61]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active+clean] exit Started/Primary/Active/Clean 24.681634 44 0.000408
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary/Active 24.885857 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started/Primary 25.900440 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] exit Started 25.900622 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=45) [0] r=0 lpr=45 crt=31'39 mlcod 31'39 active mbc={255={}}] enter Reset
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116504669s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 active pruub 138.846450806s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4541880224203014143 upacting 4541880224203014143
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116334915s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY pruub 138.846450806s@ mbc={}] exit Reset 0.000235 1 0.000387
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116334915s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY pruub 138.846450806s@ mbc={}] enter Started
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116334915s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY pruub 138.846450806s@ mbc={}] enter Start
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116334915s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY pruub 138.846450806s@ mbc={}] state<Start>: transitioning to Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116334915s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY pruub 138.846450806s@ mbc={}] exit Start 0.000064 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 61 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=15.116334915s) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY pruub 138.846450806s@ mbc={}] enter Started/Stray
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 61 handle_osd_map epochs [60,61], i have 61, src has [1,61]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:44.013011+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 61 handle_osd_map epochs [62,62], i have 61, src has [1,62]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY mbc={}] exit Started/Stray 0.767302 6 0.000233
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 crt=31'39 unknown NOTIFY mbc={}] enter Started/ReplicaActive/RepNotRecovering
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive/RepNotRecovering 0.128796 3 0.000095
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ReplicaActive 0.128886 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/WaitDeleteReserved
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/WaitDeleteReserved 0.000093 1 0.000176
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] enter Started/ToDelete/Deleting
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 DELETING pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete/Deleting 0.024807 2 0.000204
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] exit Started/ToDelete 0.024999 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 pg_epoch: 62 pg[6.f( v 31'39 (0'0,31'39] lb MIN local-lis/les=45/46 n=1 ec=37/21 lis/c=45/45 les/c/f=46/46/0 sis=61) [2] r=-1 lpr=61 pi=[45,61)/1 pct=0'0 crt=31'39 active mbc={}] exit Started 0.921396 0 0.000000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:45.013212+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe11f000/0x0/0x4ffc00000, data 0x4660d/0xab000, compress 0x0/0x0/0x0, omap 0xcc61, meta 0x1a2339f), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 427345 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:46.013427+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.345547676s of 10.401384354s, submitted: 23
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:47.013658+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 43 sent 41 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:16.412520+0000 osd.0 (osd.0) 42 : cluster [DBG] 3.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:16.423056+0000 osd.0 (osd.0) 43 : cluster [DBG] 3.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 43)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:16.412520+0000 osd.0 (osd.0) 42 : cluster [DBG] 3.15 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:16.423056+0000 osd.0 (osd.0) 43 : cluster [DBG] 3.15 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:48.014161+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 45 sent 43 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:17.408326+0000 osd.0 (osd.0) 44 : cluster [DBG] 3.17 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:17.418662+0000 osd.0 (osd.0) 45 : cluster [DBG] 3.17 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 45)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:17.408326+0000 osd.0 (osd.0) 44 : cluster [DBG] 3.17 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:17.418662+0000 osd.0 (osd.0) 45 : cluster [DBG] 3.17 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:49.014494+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 47 sent 45 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:18.437073+0000 osd.0 (osd.0) 46 : cluster [DBG] 7.1b scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:18.447245+0000 osd.0 (osd.0) 47 : cluster [DBG] 7.1b scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 47)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:18.437073+0000 osd.0 (osd.0) 46 : cluster [DBG] 7.1b scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:18.447245+0000 osd.0 (osd.0) 47 : cluster [DBG] 7.1b scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65093632 unmapped: 958464 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:50.014916+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 49 sent 47 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:19.465465+0000 osd.0 (osd.0) 48 : cluster [DBG] 3.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:19.475334+0000 osd.0 (osd.0) 49 : cluster [DBG] 3.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 49)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:19.465465+0000 osd.0 (osd.0) 48 : cluster [DBG] 3.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:19.475334+0000 osd.0 (osd.0) 49 : cluster [DBG] 3.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 438222 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:51.015177+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 51 sent 49 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:20.432654+0000 osd.0 (osd.0) 50 : cluster [DBG] 2.8 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:20.443191+0000 osd.0 (osd.0) 51 : cluster [DBG] 2.8 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 51)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:20.432654+0000 osd.0 (osd.0) 50 : cluster [DBG] 2.8 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:20.443191+0000 osd.0 (osd.0) 51 : cluster [DBG] 2.8 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:52.015439+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65110016 unmapped: 942080 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:53.015659+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 53 sent 51 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:22.445117+0000 osd.0 (osd.0) 52 : cluster [DBG] 3.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:22.455656+0000 osd.0 (osd.0) 53 : cluster [DBG] 3.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 53)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:22.445117+0000 osd.0 (osd.0) 52 : cluster [DBG] 3.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:22.455656+0000 osd.0 (osd.0) 53 : cluster [DBG] 3.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:54.015926+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:55.016107+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 440633 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65126400 unmapped: 925696 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:56.016250+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65126400 unmapped: 925696 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:57.016392+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.918884277s of 10.996058464s, submitted: 12
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 917504 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:58.016539+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 55 sent 53 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:27.408554+0000 osd.0 (osd.0) 54 : cluster [DBG] 7.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:27.419058+0000 osd.0 (osd.0) 55 : cluster [DBG] 7.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 55)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:27.408554+0000 osd.0 (osd.0) 54 : cluster [DBG] 7.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:27.419058+0000 osd.0 (osd.0) 55 : cluster [DBG] 7.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:10:59.016801+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:00.017479+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 445457 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:01.017727+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 57 sent 55 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:30.383198+0000 osd.0 (osd.0) 56 : cluster [DBG] 2.1f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:30.393763+0000 osd.0 (osd.0) 57 : cluster [DBG] 2.1f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 57)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:30.383198+0000 osd.0 (osd.0) 56 : cluster [DBG] 2.1f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:30.393763+0000 osd.0 (osd.0) 57 : cluster [DBG] 2.1f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:02.018130+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:03.018410+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:32.434529+0000 osd.0 (osd.0) 58 : cluster [DBG] 3.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:32.445065+0000 osd.0 (osd.0) 59 : cluster [DBG] 3.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 59)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:32.434529+0000 osd.0 (osd.0) 58 : cluster [DBG] 3.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:32.445065+0000 osd.0 (osd.0) 59 : cluster [DBG] 3.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:04.018698+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:05.018902+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:34.395811+0000 osd.0 (osd.0) 60 : cluster [DBG] 7.13 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:34.405960+0000 osd.0 (osd.0) 61 : cluster [DBG] 7.13 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 61)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:34.395811+0000 osd.0 (osd.0) 60 : cluster [DBG] 7.13 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:34.405960+0000 osd.0 (osd.0) 61 : cluster [DBG] 7.13 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 452694 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:06.019239+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:35.416367+0000 osd.0 (osd.0) 62 : cluster [DBG] 2.16 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:35.426254+0000 osd.0 (osd.0) 63 : cluster [DBG] 2.16 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 63)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:35.416367+0000 osd.0 (osd.0) 62 : cluster [DBG] 2.16 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:35.426254+0000 osd.0 (osd.0) 63 : cluster [DBG] 2.16 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:07.019610+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:08.019805+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:09.020082+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:10.020364+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 452694 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:11.020637+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.748119354s of 13.868241310s, submitted: 10
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:12.020892+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:41.276897+0000 osd.0 (osd.0) 64 : cluster [DBG] 7.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:41.287254+0000 osd.0 (osd.0) 65 : cluster [DBG] 7.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 65)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:41.276897+0000 osd.0 (osd.0) 64 : cluster [DBG] 7.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:41.287254+0000 osd.0 (osd.0) 65 : cluster [DBG] 7.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:13.021166+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:14.021429+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:15.021638+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 455105 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:16.021832+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:17.022052+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:46.158534+0000 osd.0 (osd.0) 66 : cluster [DBG] 2.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:46.169083+0000 osd.0 (osd.0) 67 : cluster [DBG] 2.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 67)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:46.158534+0000 osd.0 (osd.0) 66 : cluster [DBG] 2.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:46.169083+0000 osd.0 (osd.0) 67 : cluster [DBG] 2.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:18.022310+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:19.022521+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:20.022717+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 459927 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 1843200 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:21.022867+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:50.216276+0000 osd.0 (osd.0) 68 : cluster [DBG] 5.5 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:50.226827+0000 osd.0 (osd.0) 69 : cluster [DBG] 5.5 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 69)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:50.216276+0000 osd.0 (osd.0) 68 : cluster [DBG] 5.5 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:50.226827+0000 osd.0 (osd.0) 69 : cluster [DBG] 5.5 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 1835008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:22.023141+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.936560631s of 10.947173119s, submitted: 6
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 1826816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:23.023319+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:52.223376+0000 osd.0 (osd.0) 70 : cluster [DBG] 7.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:52.233798+0000 osd.0 (osd.0) 71 : cluster [DBG] 7.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 71)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:52.223376+0000 osd.0 (osd.0) 70 : cluster [DBG] 7.6 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:52.233798+0000 osd.0 (osd.0) 71 : cluster [DBG] 7.6 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 1826816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:24.023636+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 1826816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:25.023818+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 462338 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 1818624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:26.023943+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 1818624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:27.024110+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 1810432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:28.024250+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 1802240 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:29.024458+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:58.273350+0000 osd.0 (osd.0) 72 : cluster [DBG] 2.2 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:58.284111+0000 osd.0 (osd.0) 73 : cluster [DBG] 2.2 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 73)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:58.273350+0000 osd.0 (osd.0) 72 : cluster [DBG] 2.2 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:58.284111+0000 osd.0 (osd.0) 73 : cluster [DBG] 2.2 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 1794048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:30.024632+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 464749 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 1794048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:31.024827+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 1794048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:32.025040+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.101898193s of 10.109881401s, submitted: 4
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65323008 unmapped: 1777664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:33.025242+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:02.333955+0000 osd.0 (osd.0) 74 : cluster [DBG] 5.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:02.344481+0000 osd.0 (osd.0) 75 : cluster [DBG] 5.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 75)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:02.333955+0000 osd.0 (osd.0) 74 : cluster [DBG] 5.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:02.344481+0000 osd.0 (osd.0) 75 : cluster [DBG] 5.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65323008 unmapped: 1777664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:34.025438+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65323008 unmapped: 1777664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:35.025622+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 467160 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65331200 unmapped: 1769472 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:36.025756+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65331200 unmapped: 1769472 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:37.026099+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65339392 unmapped: 1761280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:38.026252+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65339392 unmapped: 1761280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:39.026425+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65339392 unmapped: 1761280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:40.026654+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:09.353971+0000 osd.0 (osd.0) 76 : cluster [DBG] 7.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:09.364531+0000 osd.0 (osd.0) 77 : cluster [DBG] 7.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 77)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:09.353971+0000 osd.0 (osd.0) 76 : cluster [DBG] 7.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:09.364531+0000 osd.0 (osd.0) 77 : cluster [DBG] 7.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 469571 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 1753088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:41.026843+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 1753088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:42.027061+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65396736 unmapped: 1703936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:43.027254+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:12.302360+0000 osd.0 (osd.0) 78 : cluster [DBG] 5.4 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:12.312892+0000 osd.0 (osd.0) 79 : cluster [DBG] 5.4 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 79)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:12.302360+0000 osd.0 (osd.0) 78 : cluster [DBG] 5.4 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:12.312892+0000 osd.0 (osd.0) 79 : cluster [DBG] 5.4 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65396736 unmapped: 1703936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:44.027505+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.910529137s of 11.991785049s, submitted: 6
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65404928 unmapped: 1695744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:45.027667+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:14.325859+0000 osd.0 (osd.0) 80 : cluster [DBG] 7.18 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:14.336396+0000 osd.0 (osd.0) 81 : cluster [DBG] 7.18 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 81)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:14.325859+0000 osd.0 (osd.0) 80 : cluster [DBG] 7.18 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:14.336396+0000 osd.0 (osd.0) 81 : cluster [DBG] 7.18 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 474395 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65404928 unmapped: 1695744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:46.027840+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65413120 unmapped: 1687552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:47.028040+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.c scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.c scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65421312 unmapped: 1679360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:48.028181+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:17.335608+0000 osd.0 (osd.0) 82 : cluster [DBG] 3.c scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:17.346098+0000 osd.0 (osd.0) 83 : cluster [DBG] 3.c scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 83)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:17.335608+0000 osd.0 (osd.0) 82 : cluster [DBG] 3.c scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:17.346098+0000 osd.0 (osd.0) 83 : cluster [DBG] 3.c scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65421312 unmapped: 1679360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:49.028401+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65421312 unmapped: 1679360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:50.028556+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 476806 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65429504 unmapped: 1671168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:51.028752+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65437696 unmapped: 1662976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:52.028893+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65445888 unmapped: 1654784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:53.029004+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65462272 unmapped: 1638400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:54.029195+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:23.406202+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.7 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:23.416740+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.7 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 85)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:23.406202+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.7 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:23.416740+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.7 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65462272 unmapped: 1638400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:55.029538+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.031082153s of 11.042010307s, submitted: 6
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 481628 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65478656 unmapped: 1622016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:56.029892+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:25.367891+0000 osd.0 (osd.0) 86 : cluster [DBG] 7.4 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:25.378494+0000 osd.0 (osd.0) 87 : cluster [DBG] 7.4 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 87)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:25.367891+0000 osd.0 (osd.0) 86 : cluster [DBG] 7.4 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:25.378494+0000 osd.0 (osd.0) 87 : cluster [DBG] 7.4 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65478656 unmapped: 1622016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:57.030451+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65478656 unmapped: 1622016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:58.030758+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65495040 unmapped: 1605632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:59.031052+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65503232 unmapped: 1597440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:00.031280+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:29.329769+0000 osd.0 (osd.0) 88 : cluster [DBG] 5.2 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:29.340340+0000 osd.0 (osd.0) 89 : cluster [DBG] 5.2 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 89)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:29.329769+0000 osd.0 (osd.0) 88 : cluster [DBG] 5.2 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:29.340340+0000 osd.0 (osd.0) 89 : cluster [DBG] 5.2 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 486452 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65511424 unmapped: 1589248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:01.031683+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:30.364905+0000 osd.0 (osd.0) 90 : cluster [DBG] 7.1f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:30.375515+0000 osd.0 (osd.0) 91 : cluster [DBG] 7.1f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 91)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:30.364905+0000 osd.0 (osd.0) 90 : cluster [DBG] 7.1f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:30.375515+0000 osd.0 (osd.0) 91 : cluster [DBG] 7.1f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65519616 unmapped: 1581056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:02.032142+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:31.387245+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:31.397761+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 93)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:31.387245+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.f scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:31.397761+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.f scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65519616 unmapped: 1581056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:03.032360+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65527808 unmapped: 1572864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:04.032692+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:33.406595+0000 osd.0 (osd.0) 94 : cluster [DBG] 3.1 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:33.417117+0000 osd.0 (osd.0) 95 : cluster [DBG] 3.1 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 95)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:33.406595+0000 osd.0 (osd.0) 94 : cluster [DBG] 3.1 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:33.417117+0000 osd.0 (osd.0) 95 : cluster [DBG] 3.1 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65536000 unmapped: 1564672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:05.032996+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 491274 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65536000 unmapped: 1564672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:06.033226+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.963118553s of 10.986808777s, submitted: 10
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65536000 unmapped: 1564672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:07.033394+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:36.354698+0000 osd.0 (osd.0) 96 : cluster [DBG] 2.18 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:36.365210+0000 osd.0 (osd.0) 97 : cluster [DBG] 2.18 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 97)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:36.354698+0000 osd.0 (osd.0) 96 : cluster [DBG] 2.18 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:36.365210+0000 osd.0 (osd.0) 97 : cluster [DBG] 2.18 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65552384 unmapped: 1548288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:08.033671+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:37.404891+0000 osd.0 (osd.0) 98 : cluster [DBG] 5.1e scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:37.415457+0000 osd.0 (osd.0) 99 : cluster [DBG] 5.1e scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 99)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:37.404891+0000 osd.0 (osd.0) 98 : cluster [DBG] 5.1e scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:37.415457+0000 osd.0 (osd.0) 99 : cluster [DBG] 5.1e scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65568768 unmapped: 1531904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:09.033959+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:38.387164+0000 osd.0 (osd.0) 100 : cluster [DBG] 2.19 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:38.397699+0000 osd.0 (osd.0) 101 : cluster [DBG] 2.19 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 101)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:38.387164+0000 osd.0 (osd.0) 100 : cluster [DBG] 2.19 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:38.397699+0000 osd.0 (osd.0) 101 : cluster [DBG] 2.19 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65585152 unmapped: 1515520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:10.034217+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:39.416523+0000 osd.0 (osd.0) 102 : cluster [DBG] 6.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:39.434113+0000 osd.0 (osd.0) 103 : cluster [DBG] 6.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 103)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:39.416523+0000 osd.0 (osd.0) 102 : cluster [DBG] 6.3 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:39.434113+0000 osd.0 (osd.0) 103 : cluster [DBG] 6.3 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 500924 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65593344 unmapped: 1507328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:11.034471+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65601536 unmapped: 1499136 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:12.034730+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:41.416689+0000 osd.0 (osd.0) 104 : cluster [DBG] 6.0 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:41.441323+0000 osd.0 (osd.0) 105 : cluster [DBG] 6.0 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 105)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:41.416689+0000 osd.0 (osd.0) 104 : cluster [DBG] 6.0 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:41.441323+0000 osd.0 (osd.0) 105 : cluster [DBG] 6.0 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65617920 unmapped: 1482752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:13.035000+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:42.436100+0000 osd.0 (osd.0) 106 : cluster [DBG] 6.7 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:42.450325+0000 osd.0 (osd.0) 107 : cluster [DBG] 6.7 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 107)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:42.436100+0000 osd.0 (osd.0) 106 : cluster [DBG] 6.7 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:42.450325+0000 osd.0 (osd.0) 107 : cluster [DBG] 6.7 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65617920 unmapped: 1482752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:14.035245+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65626112 unmapped: 1474560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:15.035383+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:44.510801+0000 osd.0 (osd.0) 108 : cluster [DBG] 6.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:44.521337+0000 osd.0 (osd.0) 109 : cluster [DBG] 6.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 109)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:44.510801+0000 osd.0 (osd.0) 108 : cluster [DBG] 6.9 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:44.521337+0000 osd.0 (osd.0) 109 : cluster [DBG] 6.9 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 508157 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65626112 unmapped: 1474560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:16.035638+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65626112 unmapped: 1474560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:17.035785+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.056098938s of 11.086086273s, submitted: 14
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65634304 unmapped: 1466368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:18.035963+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:47.440877+0000 osd.0 (osd.0) 110 : cluster [DBG] 6.5 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:47.458547+0000 osd.0 (osd.0) 111 : cluster [DBG] 6.5 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 111)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:47.440877+0000 osd.0 (osd.0) 110 : cluster [DBG] 6.5 scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:47.458547+0000 osd.0 (osd.0) 111 : cluster [DBG] 6.5 scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65642496 unmapped: 1458176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:19.036205+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.a scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.a scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 1441792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:20.036367+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:49.357916+0000 osd.0 (osd.0) 112 : cluster [DBG] 6.a scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:49.368546+0000 osd.0 (osd.0) 113 : cluster [DBG] 6.a scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 113)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:49.357916+0000 osd.0 (osd.0) 112 : cluster [DBG] 6.a scrub starts
Dec 03 21:34:47 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:49.368546+0000 osd.0 (osd.0) 113 : cluster [DBG] 6.a scrub ok
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 1441792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:21.036599+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 1441792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:22.036796+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65667072 unmapped: 1433600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:23.036998+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65667072 unmapped: 1433600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:24.037287+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:25.037554+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65675264 unmapped: 1425408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:26.037897+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65675264 unmapped: 1425408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:27.038106+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65683456 unmapped: 1417216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:28.038646+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65683456 unmapped: 1417216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:29.038895+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65683456 unmapped: 1417216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:30.039150+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 1400832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:31.039519+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 1400832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:32.039826+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 1400832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:33.040044+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65716224 unmapped: 1384448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:34.040320+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65716224 unmapped: 1384448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:35.040752+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 1376256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:36.041133+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 1376256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:37.041398+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 1376256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:38.041633+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65732608 unmapped: 1368064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:39.041956+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65732608 unmapped: 1368064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:40.042255+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 1351680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:41.042700+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 1351680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:42.042992+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:43.043223+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:44.043744+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:45.044134+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65773568 unmapped: 1327104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:46.044336+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65773568 unmapped: 1327104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:47.044624+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65773568 unmapped: 1327104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:48.044831+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 1318912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:49.045340+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65789952 unmapped: 1310720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:50.045749+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65798144 unmapped: 1302528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:51.046036+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65798144 unmapped: 1302528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:52.046268+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65806336 unmapped: 1294336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:53.046510+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65814528 unmapped: 1286144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:54.046749+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65822720 unmapped: 1277952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:55.046923+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:56.047077+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:57.047269+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:58.047443+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:59.047627+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:00.074811+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:01.074993+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65847296 unmapped: 1253376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:02.075137+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 1245184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:03.075332+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:04.075916+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:05.076101+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:06.076327+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 1228800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:07.076628+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 1228800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:08.076799+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65880064 unmapped: 1220608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:09.076972+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65880064 unmapped: 1220608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:10.077102+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65880064 unmapped: 1220608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:11.077285+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65888256 unmapped: 1212416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:12.077487+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65888256 unmapped: 1212416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:13.077691+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65896448 unmapped: 1204224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:14.077965+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65904640 unmapped: 1196032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:15.078189+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65904640 unmapped: 1196032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:16.078461+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:17.078743+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:18.078943+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:19.079163+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65921024 unmapped: 1179648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:20.079339+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65921024 unmapped: 1179648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:21.079522+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65937408 unmapped: 1163264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:22.079734+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65937408 unmapped: 1163264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:23.079970+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:24.080305+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:25.080519+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:26.080710+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:27.080946+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:28.081188+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:29.081503+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65961984 unmapped: 1138688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:30.081649+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65961984 unmapped: 1138688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:31.081848+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65970176 unmapped: 1130496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:32.082059+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65970176 unmapped: 1130496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:33.082248+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65970176 unmapped: 1130496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:34.082522+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65978368 unmapped: 1122304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:35.082774+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65978368 unmapped: 1122304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:36.082945+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65986560 unmapped: 1114112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:37.083217+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65986560 unmapped: 1114112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:38.083438+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65986560 unmapped: 1114112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:39.083636+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65994752 unmapped: 1105920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:40.083919+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65994752 unmapped: 1105920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:41.084161+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:42.084470+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:43.084786+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:44.086013+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:45.086154+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:46.086324+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:47.086528+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:48.086726+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:49.086936+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:50.087190+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66052096 unmapped: 1048576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:51.087769+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66060288 unmapped: 1040384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:52.087972+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66060288 unmapped: 1040384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:53.088188+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66068480 unmapped: 1032192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:54.088614+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66068480 unmapped: 1032192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:55.089054+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66076672 unmapped: 1024000 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:56.089388+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66076672 unmapped: 1024000 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:57.089656+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:58.089843+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:59.090075+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:00.090302+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 1007616 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:01.090652+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 991232 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:02.090870+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:03.091037+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:04.091510+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:05.091617+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:06.091736+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:07.091983+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:08.092139+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:09.092412+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:10.092737+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66142208 unmapped: 958464 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:11.093042+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66142208 unmapped: 958464 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:12.093284+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 950272 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:13.093460+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 950272 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:14.093788+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 950272 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:15.094007+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:16.094294+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:17.094717+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:18.095007+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66166784 unmapped: 933888 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:19.095212+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66166784 unmapped: 933888 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:20.095477+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66174976 unmapped: 925696 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:21.095657+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:22.095879+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:23.096163+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66199552 unmapped: 901120 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:24.096670+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66199552 unmapped: 901120 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:25.096846+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:26.097033+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:27.097217+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:28.097363+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:29.097644+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:30.097811+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:31.097972+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:32.098134+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:33.098272+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66232320 unmapped: 868352 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:34.098594+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66232320 unmapped: 868352 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:35.098770+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:36.098911+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:37.099102+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:38.099237+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 851968 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:39.099386+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 851968 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:40.099632+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:41.099770+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:42.099973+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:43.100147+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:44.100461+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66281472 unmapped: 819200 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:45.100625+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66281472 unmapped: 819200 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:46.100797+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:47.101024+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:48.101255+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:49.101396+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:50.101618+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:51.101772+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:52.102533+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:53.102635+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:54.102843+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66322432 unmapped: 778240 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:55.103040+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66322432 unmapped: 778240 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:56.103248+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:57.103509+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:58.103734+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:59.103909+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66338816 unmapped: 761856 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:00.104067+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66355200 unmapped: 745472 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:01.104247+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66363392 unmapped: 737280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:02.104501+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66363392 unmapped: 737280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:03.104656+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66363392 unmapped: 737280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:04.104815+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66371584 unmapped: 729088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:05.104955+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66371584 unmapped: 729088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:06.105097+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66379776 unmapped: 720896 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:07.105243+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66379776 unmapped: 720896 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:08.105375+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66379776 unmapped: 720896 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:09.105501+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:10.105633+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:11.105777+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:12.105900+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:13.106044+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:14.106214+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:15.106415+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66404352 unmapped: 696320 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:16.106620+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66404352 unmapped: 696320 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:17.106741+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:18.106901+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:19.107037+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:20.107236+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:21.107375+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:22.107650+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:23.107903+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:24.108112+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:25.108263+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:26.108419+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:27.108614+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66445312 unmapped: 655360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:28.108790+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66445312 unmapped: 655360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:29.108973+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66445312 unmapped: 655360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:30.109150+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:31.109318+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:32.109445+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:33.109702+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:34.109963+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:35.110163+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66469888 unmapped: 630784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:36.110363+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66469888 unmapped: 630784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:37.110684+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:38.110861+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:39.110995+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:40.111155+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:41.111339+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66486272 unmapped: 614400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:42.111643+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66486272 unmapped: 614400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:43.111911+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66494464 unmapped: 606208 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:44.112164+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66494464 unmapped: 606208 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:45.112333+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66494464 unmapped: 606208 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:46.112688+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:47.112970+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:48.113208+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:49.113422+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66510848 unmapped: 589824 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:50.113753+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66510848 unmapped: 589824 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:51.113999+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:52.114229+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:53.114533+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:54.114759+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66527232 unmapped: 573440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:55.114966+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66527232 unmapped: 573440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:56.115208+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66535424 unmapped: 565248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:57.115402+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66535424 unmapped: 565248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:58.115531+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:59.115715+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:00.115901+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:01.116108+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:02.116282+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:03.116449+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:04.116668+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66560000 unmapped: 540672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:05.116820+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66560000 unmapped: 540672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:06.117085+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:07.117276+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:08.117489+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:09.117681+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:10.117861+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66584576 unmapped: 516096 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:11.118047+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66592768 unmapped: 507904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:12.118179+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66592768 unmapped: 507904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:13.118325+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66592768 unmapped: 507904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:14.118503+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66600960 unmapped: 499712 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:15.118704+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66600960 unmapped: 499712 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:16.118890+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66600960 unmapped: 499712 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:17.119041+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:18.119178+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:19.119325+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:20.119472+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:21.119708+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:22.119871+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:23.120108+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:24.120341+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66625536 unmapped: 475136 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:25.120629+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66625536 unmapped: 475136 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:26.120887+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:27.121207+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:28.121476+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:29.121651+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:30.121837+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:31.122010+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:32.122167+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:33.122298+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:34.122547+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:35.122670+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:36.122816+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:37.122984+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:38.123121+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:39.123341+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:40.123599+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:41.123863+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:42.124082+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:43.124363+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:44.124652+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:45.124875+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:46.125000+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:47.125182+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:48.125396+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:49.125604+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:50.125758+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:51.125959+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:52.126083+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:53.126158+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:54.126302+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:55.126468+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:56.126638+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:57.126810+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:58.126997+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:59.127161+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:00.127345+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:01.127465+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:02.127629+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:03.127768+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:04.127939+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:05.128152+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:06.128387+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:07.128719+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:08.128901+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66764800 unmapped: 335872 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:09.129091+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66764800 unmapped: 335872 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:10.129251+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66772992 unmapped: 327680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:11.129403+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66772992 unmapped: 327680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:12.129558+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66772992 unmapped: 327680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:13.130078+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:14.130222+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:15.130375+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:16.130521+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:17.130618+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:18.130733+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:19.130857+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:20.131005+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:21.131144+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:22.131265+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:23.131417+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:24.131633+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:25.131812+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:26.131978+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:27.132097+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:28.132249+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:29.132426+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:30.132608+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:31.132767+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:32.132968+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:33.133130+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:34.133520+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:35.133672+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:36.133813+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:37.134041+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66854912 unmapped: 245760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:38.134204+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66854912 unmapped: 245760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:39.134352+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 237568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:40.134512+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 237568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:41.134675+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 237568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:42.134836+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:43.135034+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:44.135218+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:45.135451+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:46.135669+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:47.135823+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:48.136000+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:49.136205+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:50.136534+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:51.136703+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:52.136910+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:53.137054+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:54.137376+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:55.137614+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:56.137803+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:57.138045+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:58.138327+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:59.138502+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:00.138608+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:01.138750+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:02.138937+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:03.139267+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66936832 unmapped: 163840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:04.139415+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66936832 unmapped: 163840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:05.139554+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:06.140751+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:07.140938+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:08.141226+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:09.141479+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:10.141686+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:11.141901+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:12.142102+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:13.142277+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:14.142512+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:15.142711+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:16.142878+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:17.143067+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:18.143230+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:19.143421+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:20.143598+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:21.143745+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:22.143885+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:23.144035+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:24.144212+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:25.144330+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67010560 unmapped: 90112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:26.144493+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67010560 unmapped: 90112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:27.144675+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67010560 unmapped: 90112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:28.144818+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67018752 unmapped: 81920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:29.144957+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67018752 unmapped: 81920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:30.145093+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67026944 unmapped: 73728 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:31.145234+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67026944 unmapped: 73728 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:32.145387+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67026944 unmapped: 73728 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:33.145547+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:34.145742+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:35.145876+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:36.146033+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:37.146209+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:38.146368+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:39.146503+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 16.41 MB, 0.03 MB/s
                                           Interval WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:40.146654+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:41.146816+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:42.146998+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:43.147135+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:44.147312+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:45.147502+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:46.147668+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:47.147816+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:48.147980+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:49.148156+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:50.148382+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:51.148614+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:52.148801+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:53.148957+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:54.149307+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:55.149552+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:56.149804+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:57.149975+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:58.150229+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:59.150448+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:00.150647+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:01.150802+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:02.150939+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:03.151097+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:04.151324+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:05.151457+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:06.151661+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:07.151809+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:08.152059+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:09.152332+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:10.152533+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:11.152703+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:12.152902+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:13.153122+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:14.153285+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:15.153427+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:16.153619+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67223552 unmapped: 925696 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:17.153754+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:18.153943+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:19.154122+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:20.154276+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:21.154441+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:22.154593+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:23.154740+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:24.154958+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:25.155115+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:26.155309+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:27.155442+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:28.155534+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:29.155639+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:30.155781+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:31.155902+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:32.156041+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:33.156182+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:34.156334+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:35.156472+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:36.156680+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67289088 unmapped: 860160 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:37.156761+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:38.156877+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:39.157110+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:40.157284+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:41.157468+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:42.157609+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:43.157790+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:44.158059+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:45.158487+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:46.158697+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:47.158906+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:48.159106+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:49.159286+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:50.159422+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:51.159554+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:52.159846+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:53.160034+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:54.160279+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:55.160437+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:56.160653+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:57.160805+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:58.160947+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:59.161126+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:00.161288+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:01.161462+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:02.161630+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:03.161806+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:04.162026+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:05.162212+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:06.162387+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:07.162616+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:08.162786+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:09.162926+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:10.163075+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:11.163207+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:12.163419+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:13.163605+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:14.163802+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:15.163936+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:16.164135+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:17.164317+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:18.164499+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:19.164693+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:20.164869+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:21.165071+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:22.165298+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:23.165480+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:24.165678+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:25.165822+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:26.165967+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:27.166139+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:28.166370+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:29.166504+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:30.166681+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:31.166807+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:32.166960+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:33.167252+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:34.167456+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:35.167639+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:36.167848+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:37.167994+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:38.168111+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:39.168280+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:40.168432+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:41.168640+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:42.168812+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:43.184841+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:44.185011+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:45.185157+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:46.185351+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:47.185638+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:48.185858+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:49.186075+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:50.186399+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:51.186534+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:52.186679+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:53.186830+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:54.187040+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:55.187180+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:56.187389+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:57.187526+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:58.187752+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:59.187998+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:00.188126+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:01.188861+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:02.189002+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:03.189146+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:04.189325+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:05.189473+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:06.189631+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:07.189769+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:08.189887+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:09.190193+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:10.190321+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:11.190465+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:12.190625+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:13.190808+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:14.191005+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:15.191339+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:16.191521+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:17.191657+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:18.191841+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:19.192074+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:20.192255+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:21.192461+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:22.192638+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:23.192821+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:24.192956+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:25.193164+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:26.193374+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:27.193639+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:28.193804+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:29.193960+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:30.194130+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:31.194330+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:32.194512+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:33.194673+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:34.194933+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:35.195089+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:36.195232+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:37.195375+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:38.195610+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:39.195790+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:40.195990+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:41.196164+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:42.196380+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:43.196647+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:44.196859+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:45.197013+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:46.197189+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:47.197359+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:48.197507+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:49.197753+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:50.197918+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:51.198066+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:52.198208+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:53.198373+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:54.198788+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:55.198941+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:56.199081+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:57.199233+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:58.199380+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:59.199526+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:00.199719+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:01.199901+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:02.200058+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:03.200235+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:04.200413+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:05.200636+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:06.200776+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:07.200929+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:08.201066+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:09.201219+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:10.201350+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:11.201494+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:12.201657+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:13.201798+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:14.202002+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:15.202163+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:16.202343+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:17.202467+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:18.202678+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:19.202839+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:20.203012+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:21.203168+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:22.203294+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:23.203479+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:24.203635+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:25.203796+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:26.204012+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:27.204201+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:28.204403+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:29.204612+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:30.204780+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:31.205157+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:32.205681+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:33.206078+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:34.206361+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:35.206502+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:36.207910+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:37.208060+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:38.208191+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:39.208513+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:40.208646+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:41.208812+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:42.208984+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:43.209388+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:44.209618+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:45.209787+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:46.209917+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:47.210259+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:48.210466+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:49.210637+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:50.210799+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:51.210982+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:52.211192+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:53.211378+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:54.211604+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:55.211786+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:56.212188+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:57.212327+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:58.212486+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:59.212662+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:00.212818+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:01.212994+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:02.213267+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:03.213421+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:04.213623+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:05.213799+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:06.214003+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:07.214165+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:08.214325+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:09.214664+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:10.214855+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:11.215066+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:12.215228+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:13.215372+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:14.215623+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:15.215778+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:16.215943+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:17.216272+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:18.216446+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:19.216685+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:20.216916+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:21.217088+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:22.217299+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:23.217545+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:24.217756+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:25.217937+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:26.218100+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:27.218302+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:28.218430+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:29.218559+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:30.218744+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:31.218888+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:32.219080+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:33.219264+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:34.219498+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:35.219663+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:36.219971+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:37.220147+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:38.220301+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:39.220505+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:40.220748+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:41.220922+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:42.221104+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:43.221270+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:44.221455+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:45.221616+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: mgrc ms_handle_reset ms_handle_reset con 0x56144a546000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1553116858
Dec 03 21:34:47 compute-0 ceph-osd[86059]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1553116858,v1:192.168.122.100:6801/1553116858]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: get_auth_request con 0x561449e0a800 auth_method 0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: mgrc handle_mgr_configure stats_period=5
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:46.221767+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:47.221909+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:48.222044+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:49.222255+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:50.222496+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 ms_handle_reset con 0x561449e0b400 session 0x56144a79a700
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e32c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:51.222665+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:52.222836+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:53.223041+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:54.223288+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:55.223473+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:56.223629+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:57.223754+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:58.223915+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:59.224091+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:00.224242+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:01.224397+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:02.224553+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:03.224716+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:04.224884+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:05.225021+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:06.225368+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:07.225512+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:08.225731+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:09.226147+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:10.226314+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:11.226483+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:12.226695+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:13.226984+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:14.227178+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:15.227429+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:16.227565+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:17.227737+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:18.227868+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:19.227983+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:20.228195+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:21.228379+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:22.228557+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:23.228982+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:24.229216+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:25.229435+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:26.229598+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:27.229699+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:28.229864+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:29.229994+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:30.230160+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:31.230340+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:32.230517+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:33.230752+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:34.230945+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:35.231120+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:36.231271+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:37.231502+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:38.231643+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:39.231855+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:40.232283+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:41.232528+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:42.232856+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:43.233076+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:44.233415+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:45.233618+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:46.233834+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:47.234049+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:48.234393+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:49.234616+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:50.235060+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:51.235271+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:52.235527+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:53.235875+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:54.236116+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:55.236255+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:56.236381+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:57.236494+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:58.236646+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:59.236782+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:00.236913+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:01.237101+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:02.237282+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:03.237450+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:04.237762+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:05.237915+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:06.238131+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:07.238319+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:08.238533+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:09.238692+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:10.238833+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:11.238988+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:12.239130+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:13.239280+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:14.239517+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:15.239692+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:16.239853+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:17.240016+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:18.240192+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:19.240336+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:20.240489+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:21.240677+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:22.240833+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:23.240987+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:24.241236+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:25.241453+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:26.241677+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:27.241822+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:28.241954+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:29.242126+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:30.242339+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:31.242509+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:32.242736+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:33.242871+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:34.243074+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:35.243240+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:36.243445+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:37.243631+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:38.243778+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:39.243952+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:40.244191+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:41.244394+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:42.244620+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:43.244807+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:44.244986+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:45.245192+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:46.245345+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:47.245608+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:48.245833+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:49.246130+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:50.246396+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:51.246687+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:52.246975+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:53.247236+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:54.247414+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:55.247541+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:56.247675+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:57.247807+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:58.247989+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:59.248112+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:00.248256+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:01.248439+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:02.248665+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:03.248789+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:04.248936+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:05.249137+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:06.249312+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:07.249481+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:08.249636+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:09.249809+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:10.250007+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:11.250142+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:12.250292+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:13.250478+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:14.250693+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:15.250880+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:16.251049+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:17.251190+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:18.251439+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:19.251599+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:20.251726+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:21.251854+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:22.252020+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:23.252197+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:24.252359+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:25.252489+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:26.252620+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:27.252772+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:28.252923+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:29.253064+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:30.253258+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:31.253417+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:32.253625+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:33.253781+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:34.254028+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:35.254195+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:36.254369+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:37.254632+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:38.254828+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:39.255056+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:40.255219+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:41.255346+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:42.257663+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:43.257832+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:44.258001+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:45.258230+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:46.258446+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:47.258628+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:48.258766+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:49.258955+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:50.259120+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:51.259345+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:52.259546+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:53.259791+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:54.259998+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:55.260188+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:56.260386+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:57.260606+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:58.260844+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:59.261033+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:00.261236+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:01.261398+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:02.261549+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:03.261745+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:04.261963+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:05.262187+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:06.262349+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:07.262560+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:08.262804+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:09.262988+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:10.263164+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:11.263420+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:12.263670+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:13.263917+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:14.264120+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:15.264257+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:16.264407+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:17.264556+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:18.264733+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:19.264958+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:20.265160+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:21.265391+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:22.265675+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:23.265946+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:24.271630+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:25.271915+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:26.272135+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:27.272330+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:28.272545+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:29.272711+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:30.272921+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:31.273257+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:32.273496+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:33.273677+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:34.273930+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:35.274110+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:36.274296+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:37.274557+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:38.274788+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:39.275015+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:40.275248+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:41.275516+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:42.275685+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:43.275817+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:44.276079+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:45.276307+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:46.276461+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:47.276603+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:48.276786+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:49.276974+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:50.277119+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:51.277336+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:52.277531+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:53.277669+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:54.277836+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:55.278012+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:56.278163+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:57.278316+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:58.278533+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:59.278754+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:00.278954+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:01.279121+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:02.279245+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:03.279424+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:04.279645+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:05.279857+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:06.280093+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:07.280329+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:08.280547+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:09.280713+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:10.280867+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:11.281131+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:12.281310+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:13.281488+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:14.281714+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:15.281901+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:16.282844+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:17.283285+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:18.283516+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:19.283678+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:20.283882+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:21.284050+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:22.284237+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:23.284430+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:24.284725+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:25.284899+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:26.285062+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:27.285214+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:28.285365+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:29.285550+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:30.285761+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:31.285936+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:32.286181+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:33.286345+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:34.286553+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:35.286740+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:36.286896+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:37.287087+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:38.287255+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:39.287452+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:40.287681+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:41.287860+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:42.288064+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:43.288204+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:44.288415+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:45.288546+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:46.288765+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:47.288912+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:48.289075+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:49.289237+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:50.289402+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:51.289614+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:52.289792+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:53.290010+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:54.290241+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:55.290450+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:56.291079+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:57.291411+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:58.291544+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:59.291851+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:00.292274+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:01.292724+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:02.293040+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:03.293544+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:04.294028+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:05.294272+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:06.294455+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:07.294789+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:08.295095+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:09.295411+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:10.295698+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:11.295983+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:12.296251+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:13.296506+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:14.296872+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:15.297088+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:16.297317+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:17.297532+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:18.297759+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:19.297937+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:20.298127+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:21.298300+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:22.298500+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:23.298656+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:24.298916+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:25.299183+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:26.299374+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:27.299561+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:28.299804+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:29.300274+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:30.300635+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:31.300927+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:32.301043+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:33.301180+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:34.301353+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:35.301655+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:36.301805+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:37.302043+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:38.302219+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:39.302504+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:40.302707+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:41.303011+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:42.303268+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:43.303400+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:44.303675+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:45.303927+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:46.304284+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:47.304554+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:48.304760+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:49.304990+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:50.305238+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:51.305418+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:52.305634+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:53.305845+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:54.306095+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:55.306311+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:56.306536+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:57.306782+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:58.307132+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:59.307391+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:00.307643+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:01.307872+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:02.308044+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:03.308185+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:04.308519+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:05.308762+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:06.308956+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:07.309393+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:08.309853+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:09.310265+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:10.310531+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:11.310835+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:12.311159+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:13.311435+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:14.311718+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:15.311957+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:16.312199+0000)
Dec 03 21:34:47 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14828 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:17.312421+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:18.312645+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:19.312789+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 62 handle_osd_map epochs [63,63], i have 63, src has [1,63]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 1082.728759766s of 1082.736083984s, submitted: 4
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:20.312958+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 63 handle_osd_map epochs [63,64], i have 63, src has [1,64]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:21.313090+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 9568256 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 64 ms_handle_reset con 0x561449e33000 session 0x56144ca58fc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:22.313289+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 9568256 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 64 heartbeat osd_stat(store_statfs(0x4fdca8000/0x0/0x4ffc00000, data 0x4ba415/0x522000, compress 0x0/0x0/0x0, omap 0xd454, meta 0x1a22bac), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 64 handle_osd_map epochs [65,65], i have 65, src has [1,65]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:23.313462+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 9551872 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:24.313665+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 9551872 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc69000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 65 ms_handle_reset con 0x56144cc69000 session 0x56144a2c8540
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 586543 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:25.313858+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 65 heartbeat osd_stat(store_statfs(0x4fd4a7000/0x0/0x4ffc00000, data 0xcbb9fe/0xd25000, compress 0x0/0x0/0x0, omap 0xd6ef, meta 0x1a22911), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 65 handle_osd_map epochs [66,66], i have 66, src has [1,66]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:26.314017+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:27.314212+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:28.314400+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:29.314640+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 590035 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:30.314838+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 66 heartbeat osd_stat(store_statfs(0x4fd4a2000/0x0/0x4ffc00000, data 0xcbcfe7/0xd28000, compress 0x0/0x0/0x0, omap 0xd98e, meta 0x1a22672), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:31.315080+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:32.315326+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:33.315685+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:34.315992+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 590035 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:35.316211+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.200830460s of 15.669299126s, submitted: 15
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:36.316419+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:37.316623+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:38.316773+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:39.316917+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:40.317075+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:41.317248+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:42.317442+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:43.317628+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:44.317904+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:45.318134+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:46.318435+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:47.318638+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:48.318951+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:49.319205+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:50.319407+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:51.319697+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:52.319872+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:53.320176+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:54.320384+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:55.320611+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:56.320796+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:57.320950+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:58.321164+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:59.321399+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:00.321712+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 17661952 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:01.322028+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 17661952 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:02.322143+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 17530880 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.018529892s of 27.025806427s, submitted: 9
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 68 ms_handle_reset con 0x56144cc68c00 session 0x56144a2c8c40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:03.322304+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 68 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 17506304 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:04.322517+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 17506304 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 597319 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:05.322658+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68493312 unmapped: 17367040 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 68 heartbeat osd_stat(store_statfs(0x4fd49a000/0x0/0x4ffc00000, data 0xcbfa97/0xd30000, compress 0x0/0x0/0x0, omap 0xdab4, meta 0x1a2254c), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:06.322790+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 77086720 unmapped: 17170432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:07.322924+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 25452544 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 68 handle_osd_map epochs [68,69], i have 68, src has [1,69]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 69 ms_handle_reset con 0x56144cc66000 session 0x56144b4061c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:08.323088+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 25452544 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 69 handle_osd_map epochs [69,70], i have 69, src has [1,70]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 70 ms_handle_reset con 0x56144cc66400 session 0x56144a30efc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:09.323269+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 25493504 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x561449e33000 session 0x56144cab6000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x56144cc66000 session 0x56144ca01a40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:10.323528+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 617753 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 25411584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x56144cc68c00 session 0x56144c8aefc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x56144cc66800 session 0x56144c82c000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fd490000/0x0/0x4ffc00000, data 0xcc4073/0xd38000, compress 0x0/0x0/0x0, omap 0xd42d, meta 0x1a22bd3), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:11.323713+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 25255936 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144c86ec00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:12.323906+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fd490000/0x0/0x4ffc00000, data 0xcc4073/0xd38000, compress 0x0/0x0/0x0, omap 0xd42d, meta 0x1a22bd3), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 25108480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 71 handle_osd_map epochs [71,72], i have 71, src has [1,72]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.791918755s of 10.061242104s, submitted: 64
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 72 ms_handle_reset con 0x56144c86ec00 session 0x56144a30fa40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:13.324101+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 72 ms_handle_reset con 0x561449e33000 session 0x56144a2c8000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 72 ms_handle_reset con 0x56144cc66800 session 0x56144b55c540
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 72 heartbeat osd_stat(store_statfs(0x4fd493000/0x0/0x4ffc00000, data 0xcc4096/0xd39000, compress 0x0/0x0/0x0, omap 0xd42d, meta 0x1a22bd3), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 25239552 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:14.324298+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 25231360 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 72 handle_osd_map epochs [73,73], i have 72, src has [1,73]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 73 ms_handle_reset con 0x56144cc68c00 session 0x56144ca008c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:15.324427+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 623846 data_alloc: 218103808 data_used: 658
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf5400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf5000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69378048 unmapped: 24879104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 74 ms_handle_reset con 0x56144dcf5000 session 0x56144a79b340
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 74 ms_handle_reset con 0x56144dcf5400 session 0x56144c82c8c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:16.324632+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 24764416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 75 heartbeat osd_stat(store_statfs(0x4fd482000/0x0/0x4ffc00000, data 0xcc939a/0xd43000, compress 0x0/0x0/0x0, omap 0xce07, meta 0x1a231f9), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:17.324861+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 24764416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:18.325079+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 24731648 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:19.325267+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 24731648 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:20.325454+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 630921 data_alloc: 218103808 data_used: 4719
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69533696 unmapped: 24723456 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 75 handle_osd_map epochs [75,76], i have 75, src has [1,76]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 76 ms_handle_reset con 0x561449e33000 session 0x56144c82d880
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 76 heartbeat osd_stat(store_statfs(0x4fd488000/0x0/0x4ffc00000, data 0xcc93aa/0xd44000, compress 0x0/0x0/0x0, omap 0xcfee, meta 0x1a23012), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:21.325632+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 24657920 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:22.325803+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 76 heartbeat osd_stat(store_statfs(0x4fd482000/0x0/0x4ffc00000, data 0xcca9b6/0xd48000, compress 0x0/0x0/0x0, omap 0xd1d8, meta 0x1a22e28), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 24657920 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.904905319s of 10.081530571s, submitted: 99
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 77 ms_handle_reset con 0x56144b455c00 session 0x56144b407500
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:23.325996+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 24625152 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:24.326215+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 24625152 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 78 handle_osd_map epochs [78,79], i have 78, src has [1,79]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 79 ms_handle_reset con 0x56144b455800 session 0x56144a8228c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:25.326341+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 651584 data_alloc: 218103808 data_used: 4719
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69697536 unmapped: 24559616 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 79 handle_osd_map epochs [79,80], i have 79, src has [1,80]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 80 ms_handle_reset con 0x56144b455400 session 0x56144a823c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 80 handle_osd_map epochs [80,81], i have 80, src has [1,81]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:26.326471+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69844992 unmapped: 24412160 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 81 handle_osd_map epochs [81,82], i have 81, src has [1,82]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 82 ms_handle_reset con 0x56144cc66000 session 0x56144a79b500
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:27.326643+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 24387584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:28.326785+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 82 heartbeat osd_stat(store_statfs(0x4fd46a000/0x0/0x4ffc00000, data 0xcd30b3/0xd60000, compress 0x0/0x0/0x0, omap 0x11814, meta 0x1a1e7ec), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 24387584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:29.326952+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 24387584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:30.327118+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 665045 data_alloc: 218103808 data_used: 4719
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69861376 unmapped: 24395776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 83 ms_handle_reset con 0x561449e33000 session 0x56144c82ca80
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:31.327252+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 23314432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 83 handle_osd_map epochs [83,84], i have 83, src has [1,84]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 84 ms_handle_reset con 0x56144b455400 session 0x56144a5e4700
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:32.327392+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 23232512 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.928540230s of 10.082557678s, submitted: 92
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 85 ms_handle_reset con 0x56144b455800 session 0x56144a5e4a80
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:33.327556+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 23085056 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 85 heartbeat osd_stat(store_statfs(0x4fd466000/0x0/0x4ffc00000, data 0xcd6e5d/0xd64000, compress 0x0/0x0/0x0, omap 0x10e07, meta 0x1a1f1f9), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:34.327822+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 86 ms_handle_reset con 0x56144b455c00 session 0x56144c8aee00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 23027712 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:35.327949+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 87 ms_handle_reset con 0x56144cc68c00 session 0x56144c82c380
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 675307 data_alloc: 218103808 data_used: 8780
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 23044096 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 88 ms_handle_reset con 0x561449e33000 session 0x56144a5e5dc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:36.328051+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:37.328237+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:38.328446+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 88 heartbeat osd_stat(store_statfs(0x4fc2c0000/0x0/0x4ffc00000, data 0xcdb06a/0xd6a000, compress 0x0/0x0/0x0, omap 0x10ad6, meta 0x2bbf52a), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:39.328692+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:40.328897+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 675328 data_alloc: 218103808 data_used: 8780
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:41.329087+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:42.329310+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 22839296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:43.329442+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 22839296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:44.329686+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 89 heartbeat osd_stat(store_statfs(0x4fc2bd000/0x0/0x4ffc00000, data 0xcdc536/0xd6d000, compress 0x0/0x0/0x0, omap 0x10c0d, meta 0x2bbf3f3), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 22839296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:45.329880+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.114182472s of 12.301213264s, submitted: 110
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 679237 data_alloc: 218103808 data_used: 8780
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 22765568 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 90 ms_handle_reset con 0x56144b455c00 session 0x56144ca01880
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:46.330034+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:47.330234+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:48.330370+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fc2b9000/0x0/0x4ffc00000, data 0xcddb5e/0xd71000, compress 0x0/0x0/0x0, omap 0x10d90, meta 0x2bbf270), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:49.330536+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:50.330660+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 682673 data_alloc: 218103808 data_used: 8780
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:51.330815+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 22732800 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:52.330926+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 92 ms_handle_reset con 0x56144cc66800 session 0x56144a5e5500
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf5000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 92 ms_handle_reset con 0x56144dcf5000 session 0x56144a823c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 22716416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:53.331107+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 22716416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:54.331288+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 92 heartbeat osd_stat(store_statfs(0x4fc2b0000/0x0/0x4ffc00000, data 0xce075c/0xd78000, compress 0x0/0x0/0x0, omap 0x10f91, meta 0x2bbf06f), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:55.331465+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690209 data_alloc: 218103808 data_used: 8780
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:56.331640+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:57.331810+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:58.332039+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:59.332220+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 92 heartbeat osd_stat(store_statfs(0x4fc2b0000/0x0/0x4ffc00000, data 0xce075c/0xd78000, compress 0x0/0x0/0x0, omap 0x10f91, meta 0x2bbf06f), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:00.332404+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690209 data_alloc: 218103808 data_used: 8780
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 92 heartbeat osd_stat(store_statfs(0x4fc2b0000/0x0/0x4ffc00000, data 0xce075c/0xd78000, compress 0x0/0x0/0x0, omap 0x10f91, meta 0x2bbf06f), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf4c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 92 ms_handle_reset con 0x56144dcf4c00 session 0x56144a863340
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 22953984 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:01.332622+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.329170227s of 16.372501373s, submitted: 33
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 93 ms_handle_reset con 0x561449e33000 session 0x56144a822380
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 22953984 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:02.332848+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 93 ms_handle_reset con 0x56144cc72c00 session 0x56144a8228c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 22953984 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:03.333041+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 23142400 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 95 ms_handle_reset con 0x56144cbc4400 session 0x56144a823a40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:04.333240+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449a3b400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144c86ec00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 21807104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:05.333483+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 701785 data_alloc: 218103808 data_used: 8796
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 21807104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:06.333646+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 95 heartbeat osd_stat(store_statfs(0x4fc2a6000/0x0/0x4ffc00000, data 0xce480a/0xd82000, compress 0x0/0x0/0x0, omap 0x11421, meta 0x2bbebdf), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 21807104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:07.333856+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 21635072 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 96 ms_handle_reset con 0x56144cc79000 session 0x56144ca01500
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:08.334060+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144a818800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 21479424 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 96 heartbeat osd_stat(store_statfs(0x4fc2a3000/0x0/0x4ffc00000, data 0xce5e53/0xd87000, compress 0x0/0x0/0x0, omap 0x114cc, meta 0x2bbeb34), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x56144a818800 session 0x56144b406380
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:09.335330+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 97 heartbeat osd_stat(store_statfs(0x4fc2a0000/0x0/0x4ffc00000, data 0xce743c/0xd8a000, compress 0x0/0x0/0x0, omap 0x11577, meta 0x2bbea89), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 21463040 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x561449e33000 session 0x56144ca65a40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:10.335687+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 710022 data_alloc: 218103808 data_used: 8831
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x56144cbc4400 session 0x56144ca4ddc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 21348352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x56144cc72c00 session 0x56144b691dc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:11.336370+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.220746994s of 10.319797516s, submitted: 71
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72941568 unmapped: 21315584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:12.336657+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 99 ms_handle_reset con 0x56144cc79000 session 0x56144a823180
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc69800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 20037632 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 99 heartbeat osd_stat(store_statfs(0x4fc29c000/0x0/0x4ffc00000, data 0xcea072/0xd8e000, compress 0x0/0x0/0x0, omap 0x112a8, meta 0x2bbed58), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc69800 session 0x56144a823dc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:13.336811+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 20299776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:14.337160+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 20299776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:15.337299+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 721044 data_alloc: 218103808 data_used: 8831
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 20299776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:16.337664+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x561449e33000 session 0x56144ca00e00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cbc4400 session 0x56144a5e48c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc72c00 session 0x56144a79b500
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 20283392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 heartbeat osd_stat(store_statfs(0x4fc295000/0x0/0x4ffc00000, data 0xceba87/0xd93000, compress 0x0/0x0/0x0, omap 0x113fe, meta 0x2bbec02), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc79000 session 0x56144a862540
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:17.337824+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc69000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc69000 session 0x56144a5e5880
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x561449e33000 session 0x56144a822540
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cbc4400 session 0x56144b4061c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc72c00 session 0x56144a823340
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc79000 session 0x56144b407c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75317248 unmapped: 18939904 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc68c00 session 0x56144ca4cfc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:18.338159+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc68c00 session 0x56144ca4ce00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75448320 unmapped: 18808832 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x561449e33000 session 0x56144b6901c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:19.338483+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cbc4400 session 0x56144b691c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75505664 unmapped: 18751488 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:20.338750+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 724938 data_alloc: 218103808 data_used: 9359
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 18743296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:21.338977+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 heartbeat osd_stat(store_statfs(0x4fc298000/0x0/0x4ffc00000, data 0xceba97/0xd94000, compress 0x0/0x0/0x0, omap 0x117c7, meta 0x2bbe839), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 18735104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:22.339237+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 101 ms_handle_reset con 0x56144b455400 session 0x56144ca01a40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 18407424 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.013634682s of 11.103779793s, submitted: 68
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:23.339486+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x56144b455800 session 0x56144b407dc0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x56144b455400 session 0x56144a2c8000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x561449e33000 session 0x56144b407180
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x56144cbc4400 session 0x56144ca71500
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 18440192 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:24.339752+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 103 ms_handle_reset con 0x56144cc68c00 session 0x56144a823880
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75980800 unmapped: 18276352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:25.339986+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 104 ms_handle_reset con 0x56144b455c00 session 0x56144ca4cc40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 740651 data_alloc: 218103808 data_used: 9871
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75980800 unmapped: 18276352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:26.340181+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75980800 unmapped: 18276352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 104 heartbeat osd_stat(store_statfs(0x4fc286000/0x0/0x4ffc00000, data 0xcf1598/0xda2000, compress 0x0/0x0/0x0, omap 0x12132, meta 0x2bbdece), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:27.340343+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 104 ms_handle_reset con 0x561449e33000 session 0x56144ca59c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76005376 unmapped: 18251776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:28.340509+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 105 ms_handle_reset con 0x56144b455400 session 0x56144b407a40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 105 heartbeat osd_stat(store_statfs(0x4fc28b000/0x0/0x4ffc00000, data 0xcf1588/0xda1000, compress 0x0/0x0/0x0, omap 0x11fdc, meta 0x2bbe024), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76054528 unmapped: 18202624 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:29.340671+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 105 heartbeat osd_stat(store_statfs(0x4fc287000/0x0/0x4ffc00000, data 0xcf2783/0xda2000, compress 0x0/0x0/0x0, omap 0x11ddb, meta 0x2bbe225), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 105 ms_handle_reset con 0x56144cc72c00 session 0x56144ca8ce00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 105 ms_handle_reset con 0x56144cc79000 session 0x56144ca00c40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76054528 unmapped: 18202624 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:30.340823+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 105 handle_osd_map epochs [105,106], i have 105, src has [1,106]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 742408 data_alloc: 218103808 data_used: 14470
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 106 ms_handle_reset con 0x56144cbc4400 session 0x56144c8afa40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 106 handle_osd_map epochs [106,107], i have 106, src has [1,107]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:31.340982+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:32.341120+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 heartbeat osd_stat(store_statfs(0x4fc281000/0x0/0x4ffc00000, data 0xcf527c/0xda7000, compress 0x0/0x0/0x0, omap 0x126e7, meta 0x2bbd919), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:33.341391+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x561449a3b400 session 0x56144a5e4700
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x56144c86ec00 session 0x56144a822c40
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:34.341594+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.015930176s of 11.220238686s, submitted: 138
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x56144cbc4400 session 0x56144c82d6c0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 heartbeat osd_stat(store_statfs(0x4fc281000/0x0/0x4ffc00000, data 0xcf527c/0xda7000, compress 0x0/0x0/0x0, omap 0x126e7, meta 0x2bbd919), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:35.341709+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x56144cc66800 session 0x56144b691880
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf4c00
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 heartbeat osd_stat(store_statfs(0x4fc286000/0x0/0x4ffc00000, data 0xcf526c/0xda6000, compress 0x0/0x0/0x0, omap 0x12773, meta 0x2bbd88d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 743999 data_alloc: 218103808 data_used: 14454
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 108 ms_handle_reset con 0x56144dcf4c00 session 0x56144ca4d340
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 18546688 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 108 heartbeat osd_stat(store_statfs(0x4fc286000/0x0/0x4ffc00000, data 0xcf526c/0xda6000, compress 0x0/0x0/0x0, omap 0x12773, meta 0x2bbd88d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:36.341830+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 18546688 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:37.342071+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 108 heartbeat osd_stat(store_statfs(0x4fc284000/0x0/0x4ffc00000, data 0xcf68c5/0xda8000, compress 0x0/0x0/0x0, omap 0x12a4c, meta 0x2bbd5b4), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:38.342220+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:39.342398+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:40.342554+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 746645 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:41.342773+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:42.343031+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:43.343333+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 109 heartbeat osd_stat(store_statfs(0x4fc27f000/0x0/0x4ffc00000, data 0xcf7d91/0xdab000, compress 0x0/0x0/0x0, omap 0x12ab8, meta 0x2bbd548), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:44.343632+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:45.343849+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 109 heartbeat osd_stat(store_statfs(0x4fc27f000/0x0/0x4ffc00000, data 0xcf7d91/0xdab000, compress 0x0/0x0/0x0, omap 0x12ab8, meta 0x2bbd548), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 750139 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:46.344034+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:47.344307+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:48.344559+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.987587929s of 14.052734375s, submitted: 37
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:49.344811+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:50.345076+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:51.345279+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:52.345456+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:53.345694+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:54.345861+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:55.345985+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:56.346219+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:57.346435+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:58.346609+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:59.346808+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:00.346928+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:01.347120+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:02.347318+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:03.347711+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:04.348267+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:05.348450+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:06.348674+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:07.348862+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:08.349084+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:09.349311+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:10.349529+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:11.349719+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:12.349905+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:13.350106+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:14.350399+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:15.350598+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:16.350779+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:17.350906+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:18.351025+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:19.351143+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:20.351290+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:21.351481+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:22.351653+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:23.351813+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:24.352019+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:25.352212+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:26.352360+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:27.352634+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:28.352802+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:29.352954+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:30.353119+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:31.353297+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:32.353463+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:33.353688+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:34.353914+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:35.354061+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:36.354210+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:37.354357+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:38.354541+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:39.354682+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:40.354944+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:41.355128+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:42.355352+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:43.355534+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:44.355772+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:45.355937+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:46.356096+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:47.356299+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:48.356511+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:49.356671+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:50.356823+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread fragmentation_score=0.000120 took=0.000018s
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:51.356998+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:52.357165+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:53.357340+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:54.357590+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:55.357729+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:56.357901+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:57.358085+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:58.358215+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:59.358375+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:00.358492+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:01.358667+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:02.358804+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:03.358952+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:04.359104+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:05.359223+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:06.359334+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:07.359468+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:08.359589+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:09.359716+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:10.359843+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:11.359977+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:12.360118+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:13.360256+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:14.360418+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'config show' '{prefix=config show}'
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 17915904 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:15.360542+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76374016 unmapped: 17883136 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:34:47 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:34:47 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:16.360688+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 17965056 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:34:47 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:17.360811+0000)
Dec 03 21:34:47 compute-0 ceph-osd[86059]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:34:47 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:34:47 compute-0 ceph-mon[75204]: pgmap v893: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:47 compute-0 ceph-mon[75204]: from='client.14820 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:47 compute-0 ceph-mon[75204]: from='client.14822 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:47 compute-0 ceph-mon[75204]: from='client.14826 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:47 compute-0 ceph-mon[75204]: from='client.14824 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:48 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14832 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 03 21:34:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2722371012' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 03 21:34:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v894: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:48 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14836 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "versions"} v 0)
Dec 03 21:34:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4244599608' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 03 21:34:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:34:48.938 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:34:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:34:48.938 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:34:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:34:48.938 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:34:49 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14840 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:49 compute-0 ceph-mon[75204]: from='client.14828 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:49 compute-0 ceph-mon[75204]: from='client.14832 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:49 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2722371012' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 03 21:34:49 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4244599608' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 03 21:34:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 03 21:34:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2291713604' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:34:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 03 21:34:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3661342478' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 03 21:34:50 compute-0 ceph-mon[75204]: pgmap v894: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:50 compute-0 ceph-mon[75204]: from='client.14836 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: from='client.14840 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2291713604' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3661342478' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 03 21:34:50 compute-0 systemd[1]: Starting Hostname Service...
Dec 03 21:34:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 03 21:34:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 03 21:34:50 compute-0 systemd[1]: Started Hostname Service.
Dec 03 21:34:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v895: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 03 21:34:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3983630212' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 03 21:34:51 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14856 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:51 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 03 21:34:51 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 03 21:34:51 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3983630212' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 03 21:34:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 03 21:34:51 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1237479117' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 03 21:34:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:34:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:34:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:34:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:34:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:34:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:34:52 compute-0 ceph-mon[75204]: pgmap v895: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:52 compute-0 ceph-mon[75204]: from='client.14856 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:52 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1237479117' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 03 21:34:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df"} v 0)
Dec 03 21:34:52 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2732367683' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 03 21:34:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v896: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 03 21:34:52 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1050893136' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 03 21:34:53 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2732367683' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 03 21:34:53 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1050893136' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 03 21:34:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 03 21:34:53 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3557579394' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 03 21:34:53 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14866 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:54 compute-0 ceph-mon[75204]: pgmap v896: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:54 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3557579394' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 03 21:34:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 03 21:34:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/985449314' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 03 21:34:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v897: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 03 21:34:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/278720350' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 03 21:34:55 compute-0 ceph-mon[75204]: from='client.14866 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:55 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/985449314' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 03 21:34:55 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/278720350' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 03 21:34:55 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14872 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec 03 21:34:55 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2297876058' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 03 21:34:56 compute-0 podman[252316]: 2025-12-03 21:34:56.162498482 +0000 UTC m=+0.105698026 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:34:56 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14876 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:56 compute-0 ceph-mon[75204]: pgmap v897: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:56 compute-0 ceph-mon[75204]: from='client.14872 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:56 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2297876058' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 03 21:34:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:34:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v898: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:56 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14878 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec 03 21:34:57 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2265674782' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Dec 03 21:34:57 compute-0 ceph-mon[75204]: from='client.14876 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:57 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2265674782' entity='client.admin' cmd={"prefix": "osd dump"} : dispatch
Dec 03 21:34:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Dec 03 21:34:57 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/531983363' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch
Dec 03 21:34:57 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14884 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:58 compute-0 ovs-appctl[253090]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 21:34:58 compute-0 ovs-appctl[253098]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 21:34:58 compute-0 ovs-appctl[253122]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14886 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:34:58 compute-0 ceph-mon[75204]: pgmap v898: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:58 compute-0 ceph-mon[75204]: from='client.14878 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:58 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/531983363' entity='client.admin' cmd={"prefix": "osd numa-status"} : dispatch
Dec 03 21:34:58 compute-0 ceph-mon[75204]: from='client.14884 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v899: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Dec 03 21:34:58 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2944769902' entity='client.admin' cmd={"prefix": "osd pool ls", "detail": "detail"} : dispatch
Dec 03 21:34:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd stat"} v 0)
Dec 03 21:34:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1521308190' entity='client.admin' cmd={"prefix": "osd stat"} : dispatch
Dec 03 21:34:59 compute-0 ceph-mon[75204]: from='client.14886 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:59 compute-0 ceph-mon[75204]: pgmap v899: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:34:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2944769902' entity='client.admin' cmd={"prefix": "osd pool ls", "detail": "detail"} : dispatch
Dec 03 21:34:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1521308190' entity='client.admin' cmd={"prefix": "osd stat"} : dispatch
Dec 03 21:34:59 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14892 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:34:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:34:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/825953710' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:34:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:34:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/825953710' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:35:00 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14896 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:35:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v900: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:00 compute-0 ceph-mon[75204]: from='client.14892 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:35:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/825953710' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:35:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/825953710' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:35:00 compute-0 ceph-mon[75204]: from='client.14896 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:35:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0)
Dec 03 21:35:00 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3248953361' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:35:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Dec 03 21:35:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3067969316' entity='client.admin' cmd={"prefix": "time-sync-status"} : dispatch
Dec 03 21:35:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:01 compute-0 ceph-mon[75204]: pgmap v900: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:01 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3248953361' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:35:01 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3067969316' entity='client.admin' cmd={"prefix": "time-sync-status"} : dispatch
Dec 03 21:35:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Dec 03 21:35:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3168231262' entity='client.admin' cmd={"prefix": "config dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:02 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14906 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v901: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:02 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3168231262' entity='client.admin' cmd={"prefix": "config dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:02 compute-0 ceph-mon[75204]: from='client.14906 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Dec 03 21:35:02 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4033394318' entity='client.admin' cmd={"prefix": "df", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:35:03 compute-0 ceph-mon[75204]: pgmap v901: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:03 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4033394318' entity='client.admin' cmd={"prefix": "df", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:35:03 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Dec 03 21:35:03 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2889035687' entity='client.admin' cmd={"prefix": "df", "format": "json-pretty"} : dispatch
Dec 03 21:35:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Dec 03 21:35:04 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2318033544' entity='client.admin' cmd={"prefix": "fs dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v902: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:04 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2889035687' entity='client.admin' cmd={"prefix": "df", "format": "json-pretty"} : dispatch
Dec 03 21:35:04 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2318033544' entity='client.admin' cmd={"prefix": "fs dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Dec 03 21:35:04 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3159154276' entity='client.admin' cmd={"prefix": "fs ls", "format": "json-pretty"} : dispatch
Dec 03 21:35:05 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14916 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:05 compute-0 ceph-mon[75204]: pgmap v902: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:05 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3159154276' entity='client.admin' cmd={"prefix": "fs ls", "format": "json-pretty"} : dispatch
Dec 03 21:35:05 compute-0 ceph-mon[75204]: from='client.14916 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Dec 03 21:35:05 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1832727319' entity='client.admin' cmd={"prefix": "mds stat", "format": "json-pretty"} : dispatch
Dec 03 21:35:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Dec 03 21:35:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3196854306' entity='client.admin' cmd={"prefix": "mon dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v903: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1832727319' entity='client.admin' cmd={"prefix": "mds stat", "format": "json-pretty"} : dispatch
Dec 03 21:35:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3196854306' entity='client.admin' cmd={"prefix": "mon dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:06 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14922 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Dec 03 21:35:07 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/643280158' entity='client.admin' cmd={"prefix": "osd blocklist ls", "format": "json-pretty"} : dispatch
Dec 03 21:35:07 compute-0 ceph-mon[75204]: pgmap v903: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:07 compute-0 ceph-mon[75204]: from='client.14922 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:07 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/643280158' entity='client.admin' cmd={"prefix": "osd blocklist ls", "format": "json-pretty"} : dispatch
Dec 03 21:35:07 compute-0 virtqemud[241184]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 21:35:07 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14926 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:08 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14928 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v904: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Dec 03 21:35:08 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3565018222' entity='client.admin' cmd={"prefix": "osd dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:08 compute-0 ceph-mon[75204]: from='client.14926 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:08 compute-0 ceph-mon[75204]: from='client.14928 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:08 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3565018222' entity='client.admin' cmd={"prefix": "osd dump", "format": "json-pretty"} : dispatch
Dec 03 21:35:08 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Dec 03 21:35:08 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3689210388' entity='client.admin' cmd={"prefix": "osd numa-status", "format": "json-pretty"} : dispatch
Dec 03 21:35:09 compute-0 systemd[1]: Starting Time & Date Service...
Dec 03 21:35:09 compute-0 systemd[1]: Started Time & Date Service.
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14934 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:09 compute-0 ceph-mon[75204]: pgmap v904: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:09 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3689210388' entity='client.admin' cmd={"prefix": "osd numa-status", "format": "json-pretty"} : dispatch
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14936 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:09 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:35:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Dec 03 21:35:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/466475480' entity='client.admin' cmd={"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:35:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v905: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:10 compute-0 ceph-mon[75204]: from='client.14934 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:10 compute-0 ceph-mon[75204]: from='client.14936 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:10 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/466475480' entity='client.admin' cmd={"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:35:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec 03 21:35:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2754364413' entity='client.admin' cmd={"prefix": "osd stat", "format": "json-pretty"} : dispatch
Dec 03 21:35:11 compute-0 podman[255050]: 2025-12-03 21:35:11.134824155 +0000 UTC m=+0.078084822 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 03 21:35:11 compute-0 podman[255051]: 2025-12-03 21:35:11.149279619 +0000 UTC m=+0.081473474 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:35:11 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14942 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:11 compute-0 ceph-mon[75204]: pgmap v905: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:11 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2754364413' entity='client.admin' cmd={"prefix": "osd stat", "format": "json-pretty"} : dispatch
Dec 03 21:35:11 compute-0 ceph-mon[75204]: from='client.14942 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:11 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.14944 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec 03 21:35:12 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2578987565' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:35:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v906: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:12 compute-0 ceph-mon[75204]: from='client.14944 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:35:12 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2578987565' entity='client.admin' cmd={"prefix": "status", "format": "json-pretty"} : dispatch
Dec 03 21:35:12 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Dec 03 21:35:12 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2903488151' entity='client.admin' cmd={"prefix": "time-sync-status", "format": "json-pretty"} : dispatch
Dec 03 21:35:13 compute-0 ceph-mon[75204]: pgmap v906: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:13 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2903488151' entity='client.admin' cmd={"prefix": "time-sync-status", "format": "json-pretty"} : dispatch
Dec 03 21:35:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v907: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:15 compute-0 ceph-mon[75204]: pgmap v907: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v908: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:17 compute-0 ceph-mon[75204]: pgmap v908: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v909: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:19 compute-0 nova_compute[241566]: 2025-12-03 21:35:19.546 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:19 compute-0 ceph-mon[75204]: pgmap v909: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v910: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:35:21
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['vms', 'volumes', '.mgr', 'backups', 'cephfs.cephfs.meta', 'images', 'cephfs.cephfs.data']
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:35:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:21 compute-0 nova_compute[241566]: 2025-12-03 21:35:21.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:21 compute-0 nova_compute[241566]: 2025-12-03 21:35:21.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:35:21 compute-0 ceph-mon[75204]: pgmap v910: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:35:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:35:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v911: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:22 compute-0 nova_compute[241566]: 2025-12-03 21:35:22.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:22 compute-0 nova_compute[241566]: 2025-12-03 21:35:22.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:35:22 compute-0 nova_compute[241566]: 2025-12-03 21:35:22.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:35:22 compute-0 nova_compute[241566]: 2025-12-03 21:35:22.575 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:35:22 compute-0 nova_compute[241566]: 2025-12-03 21:35:22.575 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:22 compute-0 nova_compute[241566]: 2025-12-03 21:35:22.575 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:22 compute-0 nova_compute[241566]: 2025-12-03 21:35:22.575 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:35:23 compute-0 ceph-mon[75204]: pgmap v911: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v912: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:24 compute-0 nova_compute[241566]: 2025-12-03 21:35:24.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:24 compute-0 nova_compute[241566]: 2025-12-03 21:35:24.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:24 compute-0 nova_compute[241566]: 2025-12-03 21:35:24.578 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:35:24 compute-0 nova_compute[241566]: 2025-12-03 21:35:24.578 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:35:24 compute-0 nova_compute[241566]: 2025-12-03 21:35:24.578 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:35:24 compute-0 nova_compute[241566]: 2025-12-03 21:35:24.579 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:35:24 compute-0 nova_compute[241566]: 2025-12-03 21:35:24.579 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:35:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:35:25 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/800307686' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.106 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.336 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.338 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4960MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.339 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.340 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.436 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.437 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:35:25 compute-0 nova_compute[241566]: 2025-12-03 21:35:25.466 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:35:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v913: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:35:26 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2105869938' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:35:26 compute-0 ceph-mon[75204]: pgmap v912: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:26 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/800307686' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:35:26 compute-0 nova_compute[241566]: 2025-12-03 21:35:26.781 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.315s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:35:26 compute-0 nova_compute[241566]: 2025-12-03 21:35:26.786 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:35:26 compute-0 nova_compute[241566]: 2025-12-03 21:35:26.809 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:35:26 compute-0 nova_compute[241566]: 2025-12-03 21:35:26.812 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:35:26 compute-0 nova_compute[241566]: 2025-12-03 21:35:26.813 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:35:27 compute-0 podman[255208]: 2025-12-03 21:35:27.204005453 +0000 UTC m=+0.138390908 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:35:27 compute-0 ceph-mon[75204]: pgmap v913: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:27 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2105869938' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:35:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:35:27 compute-0 nova_compute[241566]: 2025-12-03 21:35:27.815 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v914: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:29 compute-0 nova_compute[241566]: 2025-12-03 21:35:29.546 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:35:29 compute-0 ceph-mon[75204]: pgmap v914: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v915: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:31 compute-0 ceph-mon[75204]: pgmap v915: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v916: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:33 compute-0 ceph-mon[75204]: pgmap v916: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:34 compute-0 sudo[247927]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:34 compute-0 sshd-session[247926]: Received disconnect from 192.168.122.10 port 34784:11: disconnected by user
Dec 03 21:35:34 compute-0 sshd-session[247926]: Disconnected from user zuul 192.168.122.10 port 34784
Dec 03 21:35:34 compute-0 sshd-session[247923]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:35:34 compute-0 systemd[1]: session-51.scope: Deactivated successfully.
Dec 03 21:35:34 compute-0 systemd[1]: session-51.scope: Consumed 2min 42.156s CPU time, 739.5M memory peak, read 320.7M from disk, written 64.3M to disk.
Dec 03 21:35:34 compute-0 systemd-logind[787]: Session 51 logged out. Waiting for processes to exit.
Dec 03 21:35:34 compute-0 systemd-logind[787]: Removed session 51.
Dec 03 21:35:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v917: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:34 compute-0 sshd-session[255234]: Accepted publickey for zuul from 192.168.122.10 port 54676 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:35:34 compute-0 systemd-logind[787]: New session 52 of user zuul.
Dec 03 21:35:34 compute-0 systemd[1]: Started Session 52 of User zuul.
Dec 03 21:35:34 compute-0 sshd-session[255234]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:35:34 compute-0 sudo[255237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:35:34 compute-0 sudo[255237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:34 compute-0 sudo[255237]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:34 compute-0 sudo[255243]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-12-03-bkaxeli.tar.xz
Dec 03 21:35:34 compute-0 sudo[255243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:35:34 compute-0 sudo[255243]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:34 compute-0 sshd-session[255238]: Received disconnect from 192.168.122.10 port 54676:11: disconnected by user
Dec 03 21:35:34 compute-0 sshd-session[255238]: Disconnected from user zuul 192.168.122.10 port 54676
Dec 03 21:35:34 compute-0 sshd-session[255234]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:35:34 compute-0 systemd[1]: session-52.scope: Deactivated successfully.
Dec 03 21:35:34 compute-0 systemd-logind[787]: Session 52 logged out. Waiting for processes to exit.
Dec 03 21:35:34 compute-0 sudo[255288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:35:34 compute-0 systemd-logind[787]: Removed session 52.
Dec 03 21:35:34 compute-0 sudo[255288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:34 compute-0 sshd-session[255313]: Accepted publickey for zuul from 192.168.122.10 port 54682 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:35:34 compute-0 systemd-logind[787]: New session 53 of user zuul.
Dec 03 21:35:34 compute-0 systemd[1]: Started Session 53 of User zuul.
Dec 03 21:35:34 compute-0 sshd-session[255313]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:35:35 compute-0 sudo[255319]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 03 21:35:35 compute-0 sudo[255319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:35:35 compute-0 sudo[255319]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:35 compute-0 sshd-session[255316]: Received disconnect from 192.168.122.10 port 54682:11: disconnected by user
Dec 03 21:35:35 compute-0 sshd-session[255316]: Disconnected from user zuul 192.168.122.10 port 54682
Dec 03 21:35:35 compute-0 sshd-session[255313]: pam_unix(sshd:session): session closed for user zuul
Dec 03 21:35:35 compute-0 systemd[1]: session-53.scope: Deactivated successfully.
Dec 03 21:35:35 compute-0 systemd-logind[787]: Session 53 logged out. Waiting for processes to exit.
Dec 03 21:35:35 compute-0 systemd-logind[787]: Removed session 53.
Dec 03 21:35:35 compute-0 sudo[255288]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:35:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:35:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:35:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:35:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:35:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:35:35 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:35:35 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:35:35 compute-0 sudo[255372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:35:35 compute-0 sudo[255372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:35 compute-0 sudo[255372]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:35 compute-0 sudo[255397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:35:35 compute-0 sudo[255397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:35 compute-0 podman[255434]: 2025-12-03 21:35:35.989906976 +0000 UTC m=+0.074734170 container create 13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_beaver, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:35:35 compute-0 ceph-mon[75204]: pgmap v917: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:35:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:35:35 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:35:36 compute-0 systemd[1]: Started libpod-conmon-13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363.scope.
Dec 03 21:35:36 compute-0 podman[255434]: 2025-12-03 21:35:35.956611397 +0000 UTC m=+0.041438681 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:35:36 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:35:36 compute-0 podman[255434]: 2025-12-03 21:35:36.075216545 +0000 UTC m=+0.160043819 container init 13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:35:36 compute-0 podman[255434]: 2025-12-03 21:35:36.086963275 +0000 UTC m=+0.171790489 container start 13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_beaver, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:35:36 compute-0 podman[255434]: 2025-12-03 21:35:36.091273412 +0000 UTC m=+0.176100686 container attach 13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_beaver, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:35:36 compute-0 relaxed_beaver[255451]: 167 167
Dec 03 21:35:36 compute-0 systemd[1]: libpod-13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363.scope: Deactivated successfully.
Dec 03 21:35:36 compute-0 podman[255434]: 2025-12-03 21:35:36.093656807 +0000 UTC m=+0.178484021 container died 13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_beaver, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030, OSD_FLAVOR=default)
Dec 03 21:35:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-ffb4335efbecfb9547f3e43c4e8aac8fd4a5a816b8f1b33f709e3060a53422a8-merged.mount: Deactivated successfully.
Dec 03 21:35:36 compute-0 podman[255434]: 2025-12-03 21:35:36.151875356 +0000 UTC m=+0.236702550 container remove 13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=relaxed_beaver, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:35:36 compute-0 systemd[1]: libpod-conmon-13e5e941c751cc7f987e2755dd348d4e3c2a3963e18723d6ab3189089a75b363.scope: Deactivated successfully.
Dec 03 21:35:36 compute-0 podman[255475]: 2025-12-03 21:35:36.364769067 +0000 UTC m=+0.047290332 container create af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_ganguly, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:35:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v918: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:36 compute-0 systemd[1]: Started libpod-conmon-af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef.scope.
Dec 03 21:35:36 compute-0 podman[255475]: 2025-12-03 21:35:36.345870051 +0000 UTC m=+0.028391296 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:35:36 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:35:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2745bbe77b0ecb9e75d5cf6ddfdab508f33a12cf351b0f8e4f9ecd9dcc3f25c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2745bbe77b0ecb9e75d5cf6ddfdab508f33a12cf351b0f8e4f9ecd9dcc3f25c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2745bbe77b0ecb9e75d5cf6ddfdab508f33a12cf351b0f8e4f9ecd9dcc3f25c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2745bbe77b0ecb9e75d5cf6ddfdab508f33a12cf351b0f8e4f9ecd9dcc3f25c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2745bbe77b0ecb9e75d5cf6ddfdab508f33a12cf351b0f8e4f9ecd9dcc3f25c/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:36 compute-0 podman[255475]: 2025-12-03 21:35:36.478037398 +0000 UTC m=+0.160558713 container init af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_ganguly, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:35:36 compute-0 podman[255475]: 2025-12-03 21:35:36.495160946 +0000 UTC m=+0.177682171 container start af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_ganguly, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 03 21:35:36 compute-0 podman[255475]: 2025-12-03 21:35:36.498561538 +0000 UTC m=+0.181082853 container attach af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_ganguly, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:35:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:37 compute-0 nostalgic_ganguly[255491]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:35:37 compute-0 nostalgic_ganguly[255491]: --> All data devices are unavailable
Dec 03 21:35:37 compute-0 systemd[1]: libpod-af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef.scope: Deactivated successfully.
Dec 03 21:35:37 compute-0 podman[255512]: 2025-12-03 21:35:37.166333403 +0000 UTC m=+0.039535430 container died af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_ganguly, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:35:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-c2745bbe77b0ecb9e75d5cf6ddfdab508f33a12cf351b0f8e4f9ecd9dcc3f25c-merged.mount: Deactivated successfully.
Dec 03 21:35:37 compute-0 podman[255512]: 2025-12-03 21:35:37.209777449 +0000 UTC m=+0.082979456 container remove af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nostalgic_ganguly, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:35:37 compute-0 systemd[1]: libpod-conmon-af2867b85e8fc2146ffc9a63cd8942a73184a4725968475fac9aa78b5d69c8ef.scope: Deactivated successfully.
Dec 03 21:35:37 compute-0 sudo[255397]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:37 compute-0 sudo[255527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:35:37 compute-0 sudo[255527]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:37 compute-0 sudo[255527]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:37 compute-0 sudo[255552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:35:37 compute-0 sudo[255552]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:37 compute-0 podman[255591]: 2025-12-03 21:35:37.731487657 +0000 UTC m=+0.065823288 container create f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_northcutt, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:35:37 compute-0 systemd[1]: Started libpod-conmon-f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad.scope.
Dec 03 21:35:37 compute-0 podman[255591]: 2025-12-03 21:35:37.701021386 +0000 UTC m=+0.035357077 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:35:37 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:35:37 compute-0 podman[255591]: 2025-12-03 21:35:37.833067249 +0000 UTC m=+0.167402920 container init f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_northcutt, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:35:37 compute-0 podman[255591]: 2025-12-03 21:35:37.843459183 +0000 UTC m=+0.177794814 container start f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_northcutt, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:35:37 compute-0 podman[255591]: 2025-12-03 21:35:37.847608666 +0000 UTC m=+0.181944317 container attach f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_northcutt, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:35:37 compute-0 vigilant_northcutt[255607]: 167 167
Dec 03 21:35:37 compute-0 systemd[1]: libpod-f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad.scope: Deactivated successfully.
Dec 03 21:35:37 compute-0 podman[255591]: 2025-12-03 21:35:37.853177248 +0000 UTC m=+0.187512879 container died f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_northcutt, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:35:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-18e9470347cf0f550a94eeda2e995c4e39722cf72aef7fc73e8a176fedf404db-merged.mount: Deactivated successfully.
Dec 03 21:35:37 compute-0 podman[255591]: 2025-12-03 21:35:37.893039417 +0000 UTC m=+0.227375038 container remove f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_northcutt, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:35:37 compute-0 systemd[1]: libpod-conmon-f1ee53eefc931b7d14880bc6e9e8c53e93b5df92a8ebabc24d3f588d661cfaad.scope: Deactivated successfully.
Dec 03 21:35:38 compute-0 ceph-mon[75204]: pgmap v918: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:38 compute-0 podman[255630]: 2025-12-03 21:35:38.147181772 +0000 UTC m=+0.062137377 container create 8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_darwin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:35:38 compute-0 systemd[1]: Started libpod-conmon-8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36.scope.
Dec 03 21:35:38 compute-0 podman[255630]: 2025-12-03 21:35:38.126028265 +0000 UTC m=+0.040983870 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:35:38 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:35:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/427322c3a0e1df1d8576f2b4fb61002a166d7827839a2727ded5b21ca456528a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/427322c3a0e1df1d8576f2b4fb61002a166d7827839a2727ded5b21ca456528a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/427322c3a0e1df1d8576f2b4fb61002a166d7827839a2727ded5b21ca456528a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/427322c3a0e1df1d8576f2b4fb61002a166d7827839a2727ded5b21ca456528a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:38 compute-0 podman[255630]: 2025-12-03 21:35:38.260820574 +0000 UTC m=+0.175776189 container init 8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_darwin, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:35:38 compute-0 podman[255630]: 2025-12-03 21:35:38.271553157 +0000 UTC m=+0.186508772 container start 8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_darwin, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:35:38 compute-0 podman[255630]: 2025-12-03 21:35:38.275662198 +0000 UTC m=+0.190617863 container attach 8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_darwin, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:35:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v919: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:38 compute-0 musing_darwin[255646]: {
Dec 03 21:35:38 compute-0 musing_darwin[255646]:     "0": [
Dec 03 21:35:38 compute-0 musing_darwin[255646]:         {
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "devices": [
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "/dev/loop3"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             ],
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_name": "ceph_lv0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_size": "21470642176",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "name": "ceph_lv0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "tags": {
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cluster_name": "ceph",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.crush_device_class": "",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.encrypted": "0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.objectstore": "bluestore",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osd_id": "0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.type": "block",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.vdo": "0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.with_tpm": "0"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             },
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "type": "block",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "vg_name": "ceph_vg0"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:         }
Dec 03 21:35:38 compute-0 musing_darwin[255646]:     ],
Dec 03 21:35:38 compute-0 musing_darwin[255646]:     "1": [
Dec 03 21:35:38 compute-0 musing_darwin[255646]:         {
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "devices": [
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "/dev/loop4"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             ],
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_name": "ceph_lv1",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_size": "21470642176",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "name": "ceph_lv1",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "tags": {
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cluster_name": "ceph",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.crush_device_class": "",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.encrypted": "0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.objectstore": "bluestore",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osd_id": "1",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.type": "block",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.vdo": "0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.with_tpm": "0"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             },
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "type": "block",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "vg_name": "ceph_vg1"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:         }
Dec 03 21:35:38 compute-0 musing_darwin[255646]:     ],
Dec 03 21:35:38 compute-0 musing_darwin[255646]:     "2": [
Dec 03 21:35:38 compute-0 musing_darwin[255646]:         {
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "devices": [
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "/dev/loop5"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             ],
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_name": "ceph_lv2",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_size": "21470642176",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "name": "ceph_lv2",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "tags": {
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.cluster_name": "ceph",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.crush_device_class": "",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.encrypted": "0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.objectstore": "bluestore",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osd_id": "2",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.type": "block",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.vdo": "0",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:                 "ceph.with_tpm": "0"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             },
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "type": "block",
Dec 03 21:35:38 compute-0 musing_darwin[255646]:             "vg_name": "ceph_vg2"
Dec 03 21:35:38 compute-0 musing_darwin[255646]:         }
Dec 03 21:35:38 compute-0 musing_darwin[255646]:     ]
Dec 03 21:35:38 compute-0 musing_darwin[255646]: }
Dec 03 21:35:38 compute-0 systemd[1]: libpod-8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36.scope: Deactivated successfully.
Dec 03 21:35:38 compute-0 podman[255630]: 2025-12-03 21:35:38.587485688 +0000 UTC m=+0.502441313 container died 8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_darwin, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:35:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-427322c3a0e1df1d8576f2b4fb61002a166d7827839a2727ded5b21ca456528a-merged.mount: Deactivated successfully.
Dec 03 21:35:38 compute-0 podman[255630]: 2025-12-03 21:35:38.647798305 +0000 UTC m=+0.562753870 container remove 8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=musing_darwin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:35:38 compute-0 systemd[1]: libpod-conmon-8a2c275dbece0fcb1cb71c12aaaedf0fea1d416e48022ed8ae22adb55453ab36.scope: Deactivated successfully.
Dec 03 21:35:38 compute-0 sudo[255552]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:38 compute-0 sudo[255666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:35:38 compute-0 sudo[255666]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:38 compute-0 sudo[255666]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:38 compute-0 sudo[255691]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:35:38 compute-0 sudo[255691]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:39 compute-0 podman[255729]: 2025-12-03 21:35:39.214957113 +0000 UTC m=+0.071191123 container create e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=inspiring_chandrasekhar, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0)
Dec 03 21:35:39 compute-0 systemd[1]: Started libpod-conmon-e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6.scope.
Dec 03 21:35:39 compute-0 podman[255729]: 2025-12-03 21:35:39.19024236 +0000 UTC m=+0.046476370 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:35:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:35:39 compute-0 podman[255729]: 2025-12-03 21:35:39.308521497 +0000 UTC m=+0.164755477 container init e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=inspiring_chandrasekhar, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:35:39 compute-0 podman[255729]: 2025-12-03 21:35:39.318639553 +0000 UTC m=+0.174873533 container start e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=inspiring_chandrasekhar, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:35:39 compute-0 podman[255729]: 2025-12-03 21:35:39.321776299 +0000 UTC m=+0.178010279 container attach e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=inspiring_chandrasekhar, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030)
Dec 03 21:35:39 compute-0 inspiring_chandrasekhar[255745]: 167 167
Dec 03 21:35:39 compute-0 systemd[1]: libpod-e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6.scope: Deactivated successfully.
Dec 03 21:35:39 compute-0 podman[255729]: 2025-12-03 21:35:39.324927955 +0000 UTC m=+0.181161965 container died e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=inspiring_chandrasekhar, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:35:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd61805cb09eaa4e53b6caf4816db2e8fabc5b0e0fc93ac7c01cf89c4f580ce9-merged.mount: Deactivated successfully.
Dec 03 21:35:39 compute-0 podman[255729]: 2025-12-03 21:35:39.369933283 +0000 UTC m=+0.226167263 container remove e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=inspiring_chandrasekhar, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:35:39 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 03 21:35:39 compute-0 systemd[1]: libpod-conmon-e0185218b5fac23e84c8667236d2523e9c4bc4953499d1f6e95bec51034536a6.scope: Deactivated successfully.
Dec 03 21:35:39 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 03 21:35:39 compute-0 podman[255773]: 2025-12-03 21:35:39.590229505 +0000 UTC m=+0.059305789 container create 09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_kepler, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:35:39 compute-0 systemd[1]: Started libpod-conmon-09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b.scope.
Dec 03 21:35:39 compute-0 podman[255773]: 2025-12-03 21:35:39.564322608 +0000 UTC m=+0.033398992 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:35:39 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:35:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9212ea9be43dca0469066ddcd8b277271bc0235a3647359d507ca24d451c4325/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9212ea9be43dca0469066ddcd8b277271bc0235a3647359d507ca24d451c4325/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9212ea9be43dca0469066ddcd8b277271bc0235a3647359d507ca24d451c4325/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9212ea9be43dca0469066ddcd8b277271bc0235a3647359d507ca24d451c4325/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:35:39 compute-0 podman[255773]: 2025-12-03 21:35:39.702818059 +0000 UTC m=+0.171894393 container init 09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_kepler, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:35:39 compute-0 podman[255773]: 2025-12-03 21:35:39.714793135 +0000 UTC m=+0.183869449 container start 09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_kepler, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:35:39 compute-0 podman[255773]: 2025-12-03 21:35:39.71933355 +0000 UTC m=+0.188409874 container attach 09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_kepler, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True)
Dec 03 21:35:40 compute-0 ceph-mon[75204]: pgmap v919: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v920: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:40 compute-0 lvm[255866]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:35:40 compute-0 lvm[255866]: VG ceph_vg0 finished
Dec 03 21:35:40 compute-0 lvm[255868]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:35:40 compute-0 lvm[255868]: VG ceph_vg1 finished
Dec 03 21:35:40 compute-0 lvm[255870]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:35:40 compute-0 lvm[255870]: VG ceph_vg2 finished
Dec 03 21:35:40 compute-0 practical_kepler[255789]: {}
Dec 03 21:35:40 compute-0 systemd[1]: libpod-09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b.scope: Deactivated successfully.
Dec 03 21:35:40 compute-0 systemd[1]: libpod-09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b.scope: Consumed 1.427s CPU time.
Dec 03 21:35:40 compute-0 podman[255773]: 2025-12-03 21:35:40.648880078 +0000 UTC m=+1.117956362 container died 09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_kepler, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:35:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-9212ea9be43dca0469066ddcd8b277271bc0235a3647359d507ca24d451c4325-merged.mount: Deactivated successfully.
Dec 03 21:35:40 compute-0 podman[255773]: 2025-12-03 21:35:40.700669982 +0000 UTC m=+1.169746306 container remove 09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=practical_kepler, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:35:40 compute-0 systemd[1]: libpod-conmon-09484c72a030e6f5f11609ea508c985712460f7fecd38681827dae784b671b6b.scope: Deactivated successfully.
Dec 03 21:35:40 compute-0 sudo[255691]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:35:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:35:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:35:40 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:35:40 compute-0 sudo[255885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:35:40 compute-0 sudo[255885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:35:40 compute-0 sudo[255885]: pam_unix(sudo:session): session closed for user root
Dec 03 21:35:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:41 compute-0 ceph-mon[75204]: pgmap v920: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:35:41 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:35:42 compute-0 podman[255911]: 2025-12-03 21:35:42.122407883 +0000 UTC m=+0.063432182 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 03 21:35:42 compute-0 podman[255910]: 2025-12-03 21:35:42.122899707 +0000 UTC m=+0.065186971 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 03 21:35:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v921: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:43 compute-0 ceph-mon[75204]: pgmap v921: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v922: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:45 compute-0 ceph-mon[75204]: pgmap v922: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v923: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:47 compute-0 ceph-mon[75204]: pgmap v923: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v924: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:35:48.939 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:35:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:35:48.940 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:35:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:35:48.940 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:35:49 compute-0 ceph-mon[75204]: pgmap v924: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v925: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #45. Immutable memtables: 0.
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.773177) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 45
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797751773252, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1137, "num_deletes": 256, "total_data_size": 992984, "memory_usage": 1014440, "flush_reason": "Manual Compaction"}
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #46: started
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797751782659, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 46, "file_size": 968645, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18214, "largest_seqno": 19350, "table_properties": {"data_size": 963171, "index_size": 2742, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13252, "raw_average_key_size": 19, "raw_value_size": 951452, "raw_average_value_size": 1430, "num_data_blocks": 124, "num_entries": 665, "num_filter_entries": 665, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797669, "oldest_key_time": 1764797669, "file_creation_time": 1764797751, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 46, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 9517 microseconds, and 4034 cpu microseconds.
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.782705) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #46: 968645 bytes OK
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.782724) [db/memtable_list.cc:519] [default] Level-0 commit table #46 started
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.784108) [db/memtable_list.cc:722] [default] Level-0 commit table #46: memtable #1 done
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.784120) EVENT_LOG_v1 {"time_micros": 1764797751784117, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.784138) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 987385, prev total WAL file size 987385, number of live WAL files 2.
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000042.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.784627) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353033' seq:0, type:0; will stop at (end)
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [46(945KB)], [44(4777KB)]
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797751784675, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [46], "files_L6": [44], "score": -1, "input_data_size": 5860416, "oldest_snapshot_seqno": -1}
Dec 03 21:35:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:35:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:35:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:35:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:35:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:35:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #47: 3913 keys, 5760977 bytes, temperature: kUnknown
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797751821660, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 47, "file_size": 5760977, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5732619, "index_size": 17496, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9797, "raw_key_size": 94350, "raw_average_key_size": 24, "raw_value_size": 5660112, "raw_average_value_size": 1446, "num_data_blocks": 746, "num_entries": 3913, "num_filter_entries": 3913, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797751, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.821939) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 5760977 bytes
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.823167) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.1 rd, 155.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 4.7 +0.0 blob) out(5.5 +0.0 blob), read-write-amplify(12.0) write-amplify(5.9) OK, records in: 4437, records dropped: 524 output_compression: NoCompression
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.823188) EVENT_LOG_v1 {"time_micros": 1764797751823178, "job": 22, "event": "compaction_finished", "compaction_time_micros": 37075, "compaction_time_cpu_micros": 14005, "output_level": 6, "num_output_files": 1, "total_output_size": 5760977, "num_input_records": 4437, "num_output_records": 3913, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000046.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797751823538, "job": 22, "event": "table_file_deletion", "file_number": 46}
Dec 03 21:35:51 compute-0 ceph-mon[75204]: pgmap v925: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797751824728, "job": 22, "event": "table_file_deletion", "file_number": 44}
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.784536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.824782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.824787) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.824789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.824791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:35:51 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:35:51.824792) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:35:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v926: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:53 compute-0 ceph-mon[75204]: pgmap v926: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v927: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:55 compute-0 ceph-mon[75204]: pgmap v927: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v928: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:35:57 compute-0 ceph-mon[75204]: pgmap v928: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:58 compute-0 podman[255950]: 2025-12-03 21:35:58.177668531 +0000 UTC m=+0.116591373 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec 03 21:35:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v929: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:35:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1783887507' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:35:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:35:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1783887507' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:35:59 compute-0 ceph-mon[75204]: pgmap v929: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:35:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1783887507' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:35:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1783887507' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:36:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v930: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:01 compute-0 ceph-mon[75204]: pgmap v930: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v931: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:03 compute-0 ceph-mon[75204]: pgmap v931: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v932: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:05 compute-0 ceph-mon[75204]: pgmap v932: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v933: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:07 compute-0 ceph-mon[75204]: pgmap v933: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v934: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:09 compute-0 ceph-mon[75204]: pgmap v934: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v935: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:11 compute-0 ceph-mon[75204]: pgmap v935: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v936: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:13 compute-0 podman[255977]: 2025-12-03 21:36:13.137253306 +0000 UTC m=+0.070410164 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 03 21:36:13 compute-0 podman[255976]: 2025-12-03 21:36:13.145523242 +0000 UTC m=+0.083987124 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 03 21:36:13 compute-0 ceph-mon[75204]: pgmap v936: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v937: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:15 compute-0 ceph-mon[75204]: pgmap v937: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v938: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:17 compute-0 ceph-mon[75204]: pgmap v938: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v939: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:19 compute-0 ceph-mon[75204]: pgmap v939: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v940: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:36:21
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.data', 'volumes', '.mgr', 'vms', 'cephfs.cephfs.meta', 'images', 'backups']
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:36:21 compute-0 nova_compute[241566]: 2025-12-03 21:36:21.569 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:36:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:36:21 compute-0 ceph-mon[75204]: pgmap v940: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v941: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:23 compute-0 nova_compute[241566]: 2025-12-03 21:36:23.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:23 compute-0 nova_compute[241566]: 2025-12-03 21:36:23.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:23 compute-0 nova_compute[241566]: 2025-12-03 21:36:23.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:23 compute-0 ceph-mon[75204]: pgmap v941: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v942: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:24 compute-0 nova_compute[241566]: 2025-12-03 21:36:24.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:24 compute-0 nova_compute[241566]: 2025-12-03 21:36:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:36:24 compute-0 nova_compute[241566]: 2025-12-03 21:36:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:36:24 compute-0 nova_compute[241566]: 2025-12-03 21:36:24.571 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:36:24 compute-0 nova_compute[241566]: 2025-12-03 21:36:24.571 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:24 compute-0 nova_compute[241566]: 2025-12-03 21:36:24.572 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:24 compute-0 nova_compute[241566]: 2025-12-03 21:36:24.572 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:36:25 compute-0 nova_compute[241566]: 2025-12-03 21:36:25.553 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:26 compute-0 ceph-mon[75204]: pgmap v942: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v943: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:26 compute-0 nova_compute[241566]: 2025-12-03 21:36:26.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:36:26 compute-0 nova_compute[241566]: 2025-12-03 21:36:26.577 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:36:26 compute-0 nova_compute[241566]: 2025-12-03 21:36:26.578 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:36:26 compute-0 nova_compute[241566]: 2025-12-03 21:36:26.578 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:36:26 compute-0 nova_compute[241566]: 2025-12-03 21:36:26.578 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:36:26 compute-0 nova_compute[241566]: 2025-12-03 21:36:26.579 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:36:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:36:27 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/931043877' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.125 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.546s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.384 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.385 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5135MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.385 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.386 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.457 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.457 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:36:27 compute-0 nova_compute[241566]: 2025-12-03 21:36:27.480 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:36:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:36:28 compute-0 ceph-mon[75204]: pgmap v943: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:28 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/931043877' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:36:28 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:36:28 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2803145137' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:36:28 compute-0 nova_compute[241566]: 2025-12-03 21:36:28.050 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:36:28 compute-0 nova_compute[241566]: 2025-12-03 21:36:28.055 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:36:28 compute-0 nova_compute[241566]: 2025-12-03 21:36:28.068 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:36:28 compute-0 nova_compute[241566]: 2025-12-03 21:36:28.070 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:36:28 compute-0 nova_compute[241566]: 2025-12-03 21:36:28.070 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:36:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v944: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:29 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2803145137' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:36:29 compute-0 podman[256055]: 2025-12-03 21:36:29.159510543 +0000 UTC m=+0.098959092 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 03 21:36:30 compute-0 ceph-mon[75204]: pgmap v944: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v945: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:32 compute-0 ceph-mon[75204]: pgmap v945: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v946: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:34 compute-0 ceph-mon[75204]: pgmap v946: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v947: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:36 compute-0 ceph-mon[75204]: pgmap v947: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v948: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:38 compute-0 ceph-mon[75204]: pgmap v948: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v949: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:40 compute-0 ceph-mon[75204]: pgmap v949: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v950: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:40 compute-0 sudo[256082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:36:40 compute-0 sudo[256082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:40 compute-0 sudo[256082]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:41 compute-0 sudo[256107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:36:41 compute-0 sudo[256107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:41 compute-0 sudo[256107]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:36:41 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:36:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:36:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:36:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:36:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:36:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:36:41 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:36:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:36:41 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:36:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:36:41 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:36:42 compute-0 sudo[256161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:36:42 compute-0 sudo[256161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:42 compute-0 sudo[256161]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:42 compute-0 ceph-mon[75204]: pgmap v950: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:36:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:36:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:36:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:36:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:36:42 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:36:42 compute-0 sudo[256186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:36:42 compute-0 sudo[256186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v951: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:42 compute-0 podman[256223]: 2025-12-03 21:36:42.47688102 +0000 UTC m=+0.065131428 container create 4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_napier, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:36:42 compute-0 systemd[1]: Started libpod-conmon-4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6.scope.
Dec 03 21:36:42 compute-0 podman[256223]: 2025-12-03 21:36:42.44609528 +0000 UTC m=+0.034345748 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:36:42 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:36:42 compute-0 podman[256223]: 2025-12-03 21:36:42.588743473 +0000 UTC m=+0.176993961 container init 4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_napier, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:36:42 compute-0 podman[256223]: 2025-12-03 21:36:42.602402146 +0000 UTC m=+0.190652574 container start 4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_napier, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:36:42 compute-0 podman[256223]: 2025-12-03 21:36:42.606771274 +0000 UTC m=+0.195021792 container attach 4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_napier, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:36:42 compute-0 confident_napier[256239]: 167 167
Dec 03 21:36:42 compute-0 systemd[1]: libpod-4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6.scope: Deactivated successfully.
Dec 03 21:36:42 compute-0 podman[256223]: 2025-12-03 21:36:42.611794942 +0000 UTC m=+0.200045360 container died 4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_napier, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:36:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-5fd369d795dfc5d1e80430c396926db84ca782c64ea4ab8c8515788f73e313ee-merged.mount: Deactivated successfully.
Dec 03 21:36:42 compute-0 podman[256223]: 2025-12-03 21:36:42.66082268 +0000 UTC m=+0.249073068 container remove 4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_napier, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:36:42 compute-0 systemd[1]: libpod-conmon-4ac139be6ce0fbc35abf07682e2607f6622f2f0c67f6f32f94a337bad76574d6.scope: Deactivated successfully.
Dec 03 21:36:42 compute-0 podman[256263]: 2025-12-03 21:36:42.875527209 +0000 UTC m=+0.062991799 container create d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=boring_cray, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030)
Dec 03 21:36:42 compute-0 systemd[1]: Started libpod-conmon-d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6.scope.
Dec 03 21:36:42 compute-0 podman[256263]: 2025-12-03 21:36:42.846921189 +0000 UTC m=+0.034385839 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:36:42 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:36:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f4bd19d4b9d3680a38f451b373d22e7d57d250722270aef5f7774d57480723e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f4bd19d4b9d3680a38f451b373d22e7d57d250722270aef5f7774d57480723e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f4bd19d4b9d3680a38f451b373d22e7d57d250722270aef5f7774d57480723e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f4bd19d4b9d3680a38f451b373d22e7d57d250722270aef5f7774d57480723e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f4bd19d4b9d3680a38f451b373d22e7d57d250722270aef5f7774d57480723e/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:42 compute-0 podman[256263]: 2025-12-03 21:36:42.99828417 +0000 UTC m=+0.185748740 container init d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=boring_cray, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 03 21:36:43 compute-0 podman[256263]: 2025-12-03 21:36:43.010292847 +0000 UTC m=+0.197757417 container start d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=boring_cray, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:36:43 compute-0 podman[256263]: 2025-12-03 21:36:43.014287017 +0000 UTC m=+0.201751617 container attach d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=boring_cray, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:36:43 compute-0 boring_cray[256279]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:36:43 compute-0 boring_cray[256279]: --> All data devices are unavailable
Dec 03 21:36:43 compute-0 systemd[1]: libpod-d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6.scope: Deactivated successfully.
Dec 03 21:36:43 compute-0 podman[256263]: 2025-12-03 21:36:43.637142125 +0000 UTC m=+0.824606715 container died d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=boring_cray, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:36:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-1f4bd19d4b9d3680a38f451b373d22e7d57d250722270aef5f7774d57480723e-merged.mount: Deactivated successfully.
Dec 03 21:36:43 compute-0 podman[256263]: 2025-12-03 21:36:43.715095183 +0000 UTC m=+0.902559783 container remove d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=boring_cray, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:36:43 compute-0 systemd[1]: libpod-conmon-d20e87ad908c27d47a6bc78d30fa104919aadcffcf89866f5f4861444275eff6.scope: Deactivated successfully.
Dec 03 21:36:43 compute-0 sudo[256186]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:43 compute-0 podman[256301]: 2025-12-03 21:36:43.793223205 +0000 UTC m=+0.105500560 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:36:43 compute-0 podman[256308]: 2025-12-03 21:36:43.811621427 +0000 UTC m=+0.115346629 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:36:43 compute-0 sudo[256349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:36:43 compute-0 sudo[256349]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:43 compute-0 sudo[256349]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:43 compute-0 sudo[256375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:36:43 compute-0 sudo[256375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:44 compute-0 ceph-mon[75204]: pgmap v951: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:44 compute-0 podman[256412]: 2025-12-03 21:36:44.308708274 +0000 UTC m=+0.051505307 container create 3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_fermi, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:36:44 compute-0 systemd[1]: Started libpod-conmon-3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228.scope.
Dec 03 21:36:44 compute-0 podman[256412]: 2025-12-03 21:36:44.284712109 +0000 UTC m=+0.027509232 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:36:44 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:36:44 compute-0 podman[256412]: 2025-12-03 21:36:44.417013509 +0000 UTC m=+0.159810592 container init 3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_fermi, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, ceph=True)
Dec 03 21:36:44 compute-0 podman[256412]: 2025-12-03 21:36:44.425283995 +0000 UTC m=+0.168081048 container start 3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_fermi, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS)
Dec 03 21:36:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v952: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:44 compute-0 podman[256412]: 2025-12-03 21:36:44.428451351 +0000 UTC m=+0.171248484 container attach 3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_fermi, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3)
Dec 03 21:36:44 compute-0 infallible_fermi[256430]: 167 167
Dec 03 21:36:44 compute-0 systemd[1]: libpod-3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228.scope: Deactivated successfully.
Dec 03 21:36:44 compute-0 conmon[256430]: conmon 3ac16ffc0b5e1ca35057 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228.scope/container/memory.events
Dec 03 21:36:44 compute-0 podman[256412]: 2025-12-03 21:36:44.43354226 +0000 UTC m=+0.176339313 container died 3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_fermi, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:36:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-cd1404f9862578600ed65f2f99c45f3102ec9086fde2ccf7fc595f561b5ecea5-merged.mount: Deactivated successfully.
Dec 03 21:36:44 compute-0 podman[256412]: 2025-12-03 21:36:44.484636516 +0000 UTC m=+0.227433569 container remove 3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=infallible_fermi, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:36:44 compute-0 systemd[1]: libpod-conmon-3ac16ffc0b5e1ca350579be8d70a3a8698172dfe077b9c2912a7c61764301228.scope: Deactivated successfully.
Dec 03 21:36:44 compute-0 podman[256454]: 2025-12-03 21:36:44.719382451 +0000 UTC m=+0.063760750 container create 89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=hopeful_dewdney, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:36:44 compute-0 systemd[1]: Started libpod-conmon-89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f.scope.
Dec 03 21:36:44 compute-0 podman[256454]: 2025-12-03 21:36:44.690426621 +0000 UTC m=+0.034804970 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:36:44 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:36:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d5d8a8af7706f4a3ee5dd6a2628480495a375fb7a939224c66016b2cd9430e1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d5d8a8af7706f4a3ee5dd6a2628480495a375fb7a939224c66016b2cd9430e1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d5d8a8af7706f4a3ee5dd6a2628480495a375fb7a939224c66016b2cd9430e1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d5d8a8af7706f4a3ee5dd6a2628480495a375fb7a939224c66016b2cd9430e1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:44 compute-0 podman[256454]: 2025-12-03 21:36:44.822961958 +0000 UTC m=+0.167340297 container init 89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=hopeful_dewdney, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:36:44 compute-0 podman[256454]: 2025-12-03 21:36:44.840686783 +0000 UTC m=+0.185065072 container start 89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=hopeful_dewdney, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3)
Dec 03 21:36:44 compute-0 podman[256454]: 2025-12-03 21:36:44.844897687 +0000 UTC m=+0.189275986 container attach 89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=hopeful_dewdney, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle)
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]: {
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:     "0": [
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:         {
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "devices": [
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "/dev/loop3"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             ],
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_name": "ceph_lv0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_size": "21470642176",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "name": "ceph_lv0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "tags": {
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cluster_name": "ceph",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.crush_device_class": "",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.encrypted": "0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.objectstore": "bluestore",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osd_id": "0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.type": "block",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.vdo": "0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.with_tpm": "0"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             },
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "type": "block",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "vg_name": "ceph_vg0"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:         }
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:     ],
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:     "1": [
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:         {
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "devices": [
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "/dev/loop4"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             ],
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_name": "ceph_lv1",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_size": "21470642176",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "name": "ceph_lv1",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "tags": {
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cluster_name": "ceph",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.crush_device_class": "",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.encrypted": "0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.objectstore": "bluestore",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osd_id": "1",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.type": "block",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.vdo": "0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.with_tpm": "0"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             },
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "type": "block",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "vg_name": "ceph_vg1"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:         }
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:     ],
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:     "2": [
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:         {
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "devices": [
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "/dev/loop5"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             ],
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_name": "ceph_lv2",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_size": "21470642176",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "name": "ceph_lv2",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "tags": {
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.cluster_name": "ceph",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.crush_device_class": "",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.encrypted": "0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.objectstore": "bluestore",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osd_id": "2",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.type": "block",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.vdo": "0",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:                 "ceph.with_tpm": "0"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             },
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "type": "block",
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:             "vg_name": "ceph_vg2"
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:         }
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]:     ]
Dec 03 21:36:45 compute-0 hopeful_dewdney[256470]: }
Dec 03 21:36:45 compute-0 systemd[1]: libpod-89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f.scope: Deactivated successfully.
Dec 03 21:36:45 compute-0 podman[256454]: 2025-12-03 21:36:45.21141126 +0000 UTC m=+0.555789549 container died 89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=hopeful_dewdney, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:36:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-3d5d8a8af7706f4a3ee5dd6a2628480495a375fb7a939224c66016b2cd9430e1-merged.mount: Deactivated successfully.
Dec 03 21:36:45 compute-0 podman[256454]: 2025-12-03 21:36:45.270192704 +0000 UTC m=+0.614571003 container remove 89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=hopeful_dewdney, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:36:45 compute-0 systemd[1]: libpod-conmon-89ecbda4a9ed25dc794db408d2bc209bc1c2fde3d8baadb008e2c250e6e7ce8f.scope: Deactivated successfully.
Dec 03 21:36:45 compute-0 sudo[256375]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:45 compute-0 sudo[256489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:36:45 compute-0 sudo[256489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:45 compute-0 sudo[256489]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:45 compute-0 sudo[256514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:36:45 compute-0 sudo[256514]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:45 compute-0 podman[256552]: 2025-12-03 21:36:45.90980149 +0000 UTC m=+0.077464084 container create 30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=upbeat_wright, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 03 21:36:45 compute-0 systemd[1]: Started libpod-conmon-30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0.scope.
Dec 03 21:36:45 compute-0 podman[256552]: 2025-12-03 21:36:45.879411991 +0000 UTC m=+0.047074675 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:36:45 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:36:45 compute-0 podman[256552]: 2025-12-03 21:36:45.995638073 +0000 UTC m=+0.163300937 container init 30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=upbeat_wright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 21:36:46 compute-0 podman[256552]: 2025-12-03 21:36:46.006685915 +0000 UTC m=+0.174348539 container start 30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=upbeat_wright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:36:46 compute-0 podman[256552]: 2025-12-03 21:36:46.010846398 +0000 UTC m=+0.178509022 container attach 30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=upbeat_wright, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:36:46 compute-0 upbeat_wright[256568]: 167 167
Dec 03 21:36:46 compute-0 systemd[1]: libpod-30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0.scope: Deactivated successfully.
Dec 03 21:36:46 compute-0 podman[256552]: 2025-12-03 21:36:46.015760322 +0000 UTC m=+0.183422946 container died 30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=upbeat_wright, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:36:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-e3c571f2e42c747a83bdcc54ef23a463e061c7d37a08b37fedbb698319b8071b-merged.mount: Deactivated successfully.
Dec 03 21:36:46 compute-0 ceph-mon[75204]: pgmap v952: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:46 compute-0 podman[256552]: 2025-12-03 21:36:46.107742153 +0000 UTC m=+0.275404767 container remove 30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=upbeat_wright, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:36:46 compute-0 systemd[1]: libpod-conmon-30605f8709867d1c8937eac75e577c7a1d71638dab74430ce62020d45708f7a0.scope: Deactivated successfully.
Dec 03 21:36:46 compute-0 podman[256593]: 2025-12-03 21:36:46.32494302 +0000 UTC m=+0.054204830 container create e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_thompson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:36:46 compute-0 systemd[1]: Started libpod-conmon-e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95.scope.
Dec 03 21:36:46 compute-0 podman[256593]: 2025-12-03 21:36:46.301983094 +0000 UTC m=+0.031244904 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:36:46 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:36:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52f2f54ec53b70753de59180862c3884ba37f2b579fb30e2f16f71dcca753c8c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52f2f54ec53b70753de59180862c3884ba37f2b579fb30e2f16f71dcca753c8c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52f2f54ec53b70753de59180862c3884ba37f2b579fb30e2f16f71dcca753c8c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52f2f54ec53b70753de59180862c3884ba37f2b579fb30e2f16f71dcca753c8c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:36:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v953: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:46 compute-0 podman[256593]: 2025-12-03 21:36:46.436259578 +0000 UTC m=+0.165521488 container init e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_thompson, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:36:46 compute-0 podman[256593]: 2025-12-03 21:36:46.444168064 +0000 UTC m=+0.173429894 container start e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_thompson, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:36:46 compute-0 podman[256593]: 2025-12-03 21:36:46.448842512 +0000 UTC m=+0.178104392 container attach e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_thompson, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2)
Dec 03 21:36:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:47 compute-0 lvm[256687]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:36:47 compute-0 lvm[256690]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:36:47 compute-0 lvm[256691]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:36:47 compute-0 lvm[256690]: VG ceph_vg1 finished
Dec 03 21:36:47 compute-0 lvm[256691]: VG ceph_vg2 finished
Dec 03 21:36:47 compute-0 lvm[256687]: VG ceph_vg0 finished
Dec 03 21:36:47 compute-0 stupefied_thompson[256610]: {}
Dec 03 21:36:47 compute-0 systemd[1]: libpod-e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95.scope: Deactivated successfully.
Dec 03 21:36:47 compute-0 systemd[1]: libpod-e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95.scope: Consumed 1.629s CPU time.
Dec 03 21:36:47 compute-0 podman[256593]: 2025-12-03 21:36:47.509612622 +0000 UTC m=+1.238874442 container died e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_thompson, org.label-schema.build-date=20251030, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:36:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-52f2f54ec53b70753de59180862c3884ba37f2b579fb30e2f16f71dcca753c8c-merged.mount: Deactivated successfully.
Dec 03 21:36:47 compute-0 podman[256593]: 2025-12-03 21:36:47.559921825 +0000 UTC m=+1.289183625 container remove e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_thompson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:36:47 compute-0 systemd[1]: libpod-conmon-e960e1a1ed71f8973c8865669435509cf8e4f9a2e7be43a3258326c33262ef95.scope: Deactivated successfully.
Dec 03 21:36:47 compute-0 sudo[256514]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:36:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:36:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:36:47 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:36:47 compute-0 sudo[256704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:36:47 compute-0 sudo[256704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:36:47 compute-0 sudo[256704]: pam_unix(sudo:session): session closed for user root
Dec 03 21:36:48 compute-0 ceph-mon[75204]: pgmap v953: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:36:48 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:36:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v954: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:36:48.941 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:36:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:36:48.942 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:36:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:36:48.943 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:36:50 compute-0 ceph-mon[75204]: pgmap v954: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v955: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:36:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:36:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:36:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:36:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:36:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:36:52 compute-0 ceph-mon[75204]: pgmap v955: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v956: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:54 compute-0 ceph-mon[75204]: pgmap v956: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v957: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:56 compute-0 ceph-mon[75204]: pgmap v957: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v958: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:36:58 compute-0 ceph-mon[75204]: pgmap v958: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v959: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:36:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:36:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1084562545' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:36:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:36:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1084562545' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:37:00 compute-0 ceph-mon[75204]: pgmap v959: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1084562545' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:37:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1084562545' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:37:00 compute-0 podman[256729]: 2025-12-03 21:37:00.214292237 +0000 UTC m=+0.140580819 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 21:37:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v960: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:02 compute-0 ceph-mon[75204]: pgmap v960: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v961: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:04 compute-0 ceph-mon[75204]: pgmap v961: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v962: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:06 compute-0 ceph-mon[75204]: pgmap v962: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v963: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:08 compute-0 ceph-mon[75204]: pgmap v963: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v964: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:10 compute-0 ceph-mon[75204]: pgmap v964: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v965: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:12 compute-0 ceph-mon[75204]: pgmap v965: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v966: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:14 compute-0 podman[256757]: 2025-12-03 21:37:14.145664869 +0000 UTC m=+0.077183038 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:37:14 compute-0 podman[256756]: 2025-12-03 21:37:14.162439717 +0000 UTC m=+0.091689674 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 21:37:14 compute-0 ceph-mon[75204]: pgmap v966: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v967: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:16 compute-0 ceph-mon[75204]: pgmap v967: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v968: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:17 compute-0 nova_compute[241566]: 2025-12-03 21:37:17.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:17 compute-0 nova_compute[241566]: 2025-12-03 21:37:17.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 03 21:37:18 compute-0 ceph-mon[75204]: pgmap v968: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v969: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:18 compute-0 nova_compute[241566]: 2025-12-03 21:37:18.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:19 compute-0 nova_compute[241566]: 2025-12-03 21:37:19.576 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:19 compute-0 nova_compute[241566]: 2025-12-03 21:37:19.577 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 03 21:37:19 compute-0 nova_compute[241566]: 2025-12-03 21:37:19.599 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 03 21:37:20 compute-0 ceph-mon[75204]: pgmap v969: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v970: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:37:21
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.data', 'images', '.mgr', 'cephfs.cephfs.meta', 'backups', 'volumes', 'vms']
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:37:21 compute-0 nova_compute[241566]: 2025-12-03 21:37:21.568 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:37:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:37:22 compute-0 ceph-mon[75204]: pgmap v970: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v971: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:23 compute-0 nova_compute[241566]: 2025-12-03 21:37:23.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:23 compute-0 nova_compute[241566]: 2025-12-03 21:37:23.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:24 compute-0 ceph-mon[75204]: pgmap v971: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v972: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:24 compute-0 nova_compute[241566]: 2025-12-03 21:37:24.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:24 compute-0 nova_compute[241566]: 2025-12-03 21:37:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:37:24 compute-0 nova_compute[241566]: 2025-12-03 21:37:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:37:24 compute-0 nova_compute[241566]: 2025-12-03 21:37:24.579 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:37:24 compute-0 nova_compute[241566]: 2025-12-03 21:37:24.580 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:25 compute-0 nova_compute[241566]: 2025-12-03 21:37:25.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:25 compute-0 nova_compute[241566]: 2025-12-03 21:37:25.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:25 compute-0 nova_compute[241566]: 2025-12-03 21:37:25.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:37:26 compute-0 ceph-mon[75204]: pgmap v972: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v973: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:27 compute-0 nova_compute[241566]: 2025-12-03 21:37:27.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:37:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:37:28 compute-0 ceph-mon[75204]: pgmap v973: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v974: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:28 compute-0 nova_compute[241566]: 2025-12-03 21:37:28.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:28 compute-0 nova_compute[241566]: 2025-12-03 21:37:28.605 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:37:28 compute-0 nova_compute[241566]: 2025-12-03 21:37:28.606 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:37:28 compute-0 nova_compute[241566]: 2025-12-03 21:37:28.606 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:37:28 compute-0 nova_compute[241566]: 2025-12-03 21:37:28.606 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:37:28 compute-0 nova_compute[241566]: 2025-12-03 21:37:28.607 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:37:29 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:37:29 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/643660393' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.169 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.562s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:37:29 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/643660393' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.406 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.407 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5142MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.407 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.408 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.644 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.645 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.770 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing inventories for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.928 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Updating ProviderTree inventory for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.929 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Updating inventory in ProviderTree for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.957 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing aggregate associations for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 03 21:37:29 compute-0 nova_compute[241566]: 2025-12-03 21:37:29.988 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing trait associations for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775, traits: HW_CPU_X86_SSE4A,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AESNI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_F16C,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 03 21:37:30 compute-0 nova_compute[241566]: 2025-12-03 21:37:30.013 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:37:30 compute-0 ceph-mon[75204]: pgmap v974: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v975: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:37:30 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1964727865' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:37:30 compute-0 nova_compute[241566]: 2025-12-03 21:37:30.558 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:37:30 compute-0 nova_compute[241566]: 2025-12-03 21:37:30.566 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:37:30 compute-0 nova_compute[241566]: 2025-12-03 21:37:30.739 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:37:30 compute-0 nova_compute[241566]: 2025-12-03 21:37:30.742 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:37:30 compute-0 nova_compute[241566]: 2025-12-03 21:37:30.743 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:37:31 compute-0 podman[256837]: 2025-12-03 21:37:31.212986768 +0000 UTC m=+0.148189786 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:37:31 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1964727865' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:37:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:32 compute-0 ceph-mon[75204]: pgmap v975: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v976: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:34 compute-0 ceph-mon[75204]: pgmap v976: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v977: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:34 compute-0 nova_compute[241566]: 2025-12-03 21:37:34.738 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:36 compute-0 ceph-mon[75204]: pgmap v977: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v978: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:38 compute-0 ceph-mon[75204]: pgmap v978: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v979: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:40 compute-0 ceph-mon[75204]: pgmap v979: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v980: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:42 compute-0 ceph-mon[75204]: pgmap v980: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v981: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:44 compute-0 ceph-mon[75204]: pgmap v981: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v982: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:45 compute-0 podman[256864]: 2025-12-03 21:37:45.16301807 +0000 UTC m=+0.078865423 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec 03 21:37:45 compute-0 podman[256863]: 2025-12-03 21:37:45.16923746 +0000 UTC m=+0.089766121 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 21:37:45 compute-0 nova_compute[241566]: 2025-12-03 21:37:45.968 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:37:46 compute-0 ceph-mon[75204]: pgmap v982: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v983: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:47 compute-0 sudo[256897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:37:47 compute-0 sudo[256897]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:47 compute-0 sudo[256897]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:47 compute-0 sudo[256922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:37:47 compute-0 sudo[256922]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:48 compute-0 ceph-mon[75204]: pgmap v983: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v984: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:48 compute-0 sudo[256922]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:37:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:37:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:37:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:37:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:37:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:37:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:37:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:37:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:37:48 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:37:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:37:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:37:48 compute-0 sudo[256979]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:37:48 compute-0 sudo[256979]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:48 compute-0 sudo[256979]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:48 compute-0 sudo[257004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:37:48 compute-0 sudo[257004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:37:48.942 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:37:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:37:48.943 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:37:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:37:48.943 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:37:49 compute-0 podman[257041]: 2025-12-03 21:37:49.101883309 +0000 UTC m=+0.049159273 container create 891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_jennings, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:37:49 compute-0 systemd[1]: Started libpod-conmon-891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54.scope.
Dec 03 21:37:49 compute-0 podman[257041]: 2025-12-03 21:37:49.080918326 +0000 UTC m=+0.028194320 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:37:49 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:37:49 compute-0 podman[257041]: 2025-12-03 21:37:49.201298761 +0000 UTC m=+0.148574755 container init 891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_jennings, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 03 21:37:49 compute-0 podman[257041]: 2025-12-03 21:37:49.21224216 +0000 UTC m=+0.159518124 container start 891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_jennings, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:37:49 compute-0 podman[257041]: 2025-12-03 21:37:49.216074114 +0000 UTC m=+0.163350118 container attach 891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_jennings, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:37:49 compute-0 unruffled_jennings[257058]: 167 167
Dec 03 21:37:49 compute-0 systemd[1]: libpod-891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54.scope: Deactivated successfully.
Dec 03 21:37:49 compute-0 conmon[257058]: conmon 891aecf24436cd99aefe <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54.scope/container/memory.events
Dec 03 21:37:49 compute-0 podman[257041]: 2025-12-03 21:37:49.222441839 +0000 UTC m=+0.169717803 container died 891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_jennings, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle)
Dec 03 21:37:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-2cd2a00657e0e2d2613a153bf316f3c621e951992bc086243ed8893b49153873-merged.mount: Deactivated successfully.
Dec 03 21:37:49 compute-0 podman[257041]: 2025-12-03 21:37:49.276030571 +0000 UTC m=+0.223306525 container remove 891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_jennings, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:37:49 compute-0 systemd[1]: libpod-conmon-891aecf24436cd99aefe9aff478404b65496cc324f98dc4d0a99194bfa6f4d54.scope: Deactivated successfully.
Dec 03 21:37:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:37:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:37:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:37:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:37:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:37:49 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:37:49 compute-0 podman[257083]: 2025-12-03 21:37:49.535910584 +0000 UTC m=+0.076099189 container create ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_booth, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, CEPH_REF=tentacle)
Dec 03 21:37:49 compute-0 systemd[1]: Started libpod-conmon-ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54.scope.
Dec 03 21:37:49 compute-0 podman[257083]: 2025-12-03 21:37:49.501901956 +0000 UTC m=+0.042090621 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:37:49 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:37:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7648b07cdf318bc487d196b13a0b5030b75f79ee282e22d41b054cb7ebc3b753/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7648b07cdf318bc487d196b13a0b5030b75f79ee282e22d41b054cb7ebc3b753/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7648b07cdf318bc487d196b13a0b5030b75f79ee282e22d41b054cb7ebc3b753/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7648b07cdf318bc487d196b13a0b5030b75f79ee282e22d41b054cb7ebc3b753/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7648b07cdf318bc487d196b13a0b5030b75f79ee282e22d41b054cb7ebc3b753/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:49 compute-0 podman[257083]: 2025-12-03 21:37:49.642836792 +0000 UTC m=+0.183025467 container init ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_booth, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 03 21:37:49 compute-0 podman[257083]: 2025-12-03 21:37:49.661180552 +0000 UTC m=+0.201369157 container start ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_booth, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:37:49 compute-0 podman[257083]: 2025-12-03 21:37:49.665270204 +0000 UTC m=+0.205458819 container attach ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_booth, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:37:50 compute-0 competent_booth[257099]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:37:50 compute-0 competent_booth[257099]: --> All data devices are unavailable
Dec 03 21:37:50 compute-0 systemd[1]: libpod-ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54.scope: Deactivated successfully.
Dec 03 21:37:50 compute-0 podman[257083]: 2025-12-03 21:37:50.228440634 +0000 UTC m=+0.768629219 container died ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_booth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:37:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-7648b07cdf318bc487d196b13a0b5030b75f79ee282e22d41b054cb7ebc3b753-merged.mount: Deactivated successfully.
Dec 03 21:37:50 compute-0 podman[257083]: 2025-12-03 21:37:50.271191671 +0000 UTC m=+0.811380286 container remove ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=competent_booth, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:37:50 compute-0 systemd[1]: libpod-conmon-ebdccf733a3b48fff59d14d002c705299209e7bc522ec30552c53fb83f7b7b54.scope: Deactivated successfully.
Dec 03 21:37:50 compute-0 sudo[257004]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:50 compute-0 ceph-mon[75204]: pgmap v984: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:50 compute-0 sudo[257131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:37:50 compute-0 sudo[257131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:50 compute-0 sudo[257131]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v985: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:50 compute-0 sudo[257156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:37:50 compute-0 sudo[257156]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:50 compute-0 podman[257193]: 2025-12-03 21:37:50.861345908 +0000 UTC m=+0.077592409 container create 1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_wing, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:37:50 compute-0 systemd[1]: Started libpod-conmon-1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88.scope.
Dec 03 21:37:50 compute-0 podman[257193]: 2025-12-03 21:37:50.828928592 +0000 UTC m=+0.045175143 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:37:50 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:37:50 compute-0 podman[257193]: 2025-12-03 21:37:50.959703112 +0000 UTC m=+0.175949583 container init 1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_wing, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:37:50 compute-0 podman[257193]: 2025-12-03 21:37:50.967488335 +0000 UTC m=+0.183734816 container start 1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_wing, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:37:50 compute-0 podman[257193]: 2025-12-03 21:37:50.971947926 +0000 UTC m=+0.188194397 container attach 1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_wing, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default)
Dec 03 21:37:50 compute-0 dreamy_wing[257209]: 167 167
Dec 03 21:37:50 compute-0 systemd[1]: libpod-1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88.scope: Deactivated successfully.
Dec 03 21:37:50 compute-0 podman[257193]: 2025-12-03 21:37:50.974436154 +0000 UTC m=+0.190682645 container died 1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_wing, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:37:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-3e6daeea07c61f83d784311bf3d1bf0790383b8388b60f8687fd96f5a2677154-merged.mount: Deactivated successfully.
Dec 03 21:37:51 compute-0 podman[257193]: 2025-12-03 21:37:51.027408239 +0000 UTC m=+0.243654730 container remove 1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=dreamy_wing, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle)
Dec 03 21:37:51 compute-0 systemd[1]: libpod-conmon-1fc8f092932eca0571501c8c60ddbf4ef8febf4d85c8eb7bbf2601f56b9b2b88.scope: Deactivated successfully.
Dec 03 21:37:51 compute-0 podman[257234]: 2025-12-03 21:37:51.276434786 +0000 UTC m=+0.069286562 container create ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_hypatia, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:37:51 compute-0 systemd[1]: Started libpod-conmon-ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9.scope.
Dec 03 21:37:51 compute-0 podman[257234]: 2025-12-03 21:37:51.249085249 +0000 UTC m=+0.041937115 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:37:51 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:37:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3acdeb2d533b86e1e295b2ede886b1639e0c8414410fee0f4383c3d6b48f91/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3acdeb2d533b86e1e295b2ede886b1639e0c8414410fee0f4383c3d6b48f91/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3acdeb2d533b86e1e295b2ede886b1639e0c8414410fee0f4383c3d6b48f91/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c3acdeb2d533b86e1e295b2ede886b1639e0c8414410fee0f4383c3d6b48f91/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:51 compute-0 podman[257234]: 2025-12-03 21:37:51.379997282 +0000 UTC m=+0.172849138 container init ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_hypatia, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:37:51 compute-0 podman[257234]: 2025-12-03 21:37:51.391987159 +0000 UTC m=+0.184838965 container start ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_hypatia, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:37:51 compute-0 podman[257234]: 2025-12-03 21:37:51.396857553 +0000 UTC m=+0.189709409 container attach ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_hypatia, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]: {
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:     "0": [
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:         {
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "devices": [
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "/dev/loop3"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             ],
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_name": "ceph_lv0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_size": "21470642176",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "name": "ceph_lv0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "tags": {
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cluster_name": "ceph",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.crush_device_class": "",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.encrypted": "0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.objectstore": "bluestore",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osd_id": "0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.type": "block",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.vdo": "0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.with_tpm": "0"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             },
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "type": "block",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "vg_name": "ceph_vg0"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:         }
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:     ],
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:     "1": [
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:         {
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "devices": [
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "/dev/loop4"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             ],
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_name": "ceph_lv1",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_size": "21470642176",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "name": "ceph_lv1",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "tags": {
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cluster_name": "ceph",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.crush_device_class": "",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.encrypted": "0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.objectstore": "bluestore",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osd_id": "1",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.type": "block",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.vdo": "0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.with_tpm": "0"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             },
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "type": "block",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "vg_name": "ceph_vg1"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:         }
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:     ],
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:     "2": [
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:         {
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "devices": [
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "/dev/loop5"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             ],
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_name": "ceph_lv2",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_size": "21470642176",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "name": "ceph_lv2",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "tags": {
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.cluster_name": "ceph",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.crush_device_class": "",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.encrypted": "0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.objectstore": "bluestore",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osd_id": "2",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.type": "block",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.vdo": "0",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:                 "ceph.with_tpm": "0"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             },
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "type": "block",
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:             "vg_name": "ceph_vg2"
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:         }
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]:     ]
Dec 03 21:37:51 compute-0 vigilant_hypatia[257250]: }
Dec 03 21:37:51 compute-0 systemd[1]: libpod-ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9.scope: Deactivated successfully.
Dec 03 21:37:51 compute-0 podman[257234]: 2025-12-03 21:37:51.702785032 +0000 UTC m=+0.495636798 container died ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_hypatia, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Dec 03 21:37:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c3acdeb2d533b86e1e295b2ede886b1639e0c8414410fee0f4383c3d6b48f91-merged.mount: Deactivated successfully.
Dec 03 21:37:51 compute-0 podman[257234]: 2025-12-03 21:37:51.753437704 +0000 UTC m=+0.546289500 container remove ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vigilant_hypatia, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:37:51 compute-0 systemd[1]: libpod-conmon-ee2b8434a384aaefdca1a1d70e659de9678bfb7f180b5a52b71959f0eeb8cbc9.scope: Deactivated successfully.
Dec 03 21:37:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:37:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:37:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:37:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:37:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:37:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:37:51 compute-0 sudo[257156]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:51 compute-0 sudo[257269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:37:51 compute-0 sudo[257269]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:51 compute-0 sudo[257269]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:51 compute-0 sudo[257294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:37:51 compute-0 sudo[257294]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:52 compute-0 podman[257331]: 2025-12-03 21:37:52.29704051 +0000 UTC m=+0.069304062 container create 3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_greider, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:37:52 compute-0 systemd[1]: Started libpod-conmon-3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5.scope.
Dec 03 21:37:52 compute-0 podman[257331]: 2025-12-03 21:37:52.272228103 +0000 UTC m=+0.044491735 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:37:52 compute-0 ceph-mon[75204]: pgmap v985: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:52 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:37:52 compute-0 podman[257331]: 2025-12-03 21:37:52.405627414 +0000 UTC m=+0.177891036 container init 3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_greider, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:37:52 compute-0 podman[257331]: 2025-12-03 21:37:52.417428325 +0000 UTC m=+0.189691918 container start 3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_greider, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:37:52 compute-0 podman[257331]: 2025-12-03 21:37:52.421133537 +0000 UTC m=+0.193397129 container attach 3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_greider, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=tentacle)
Dec 03 21:37:52 compute-0 jovial_greider[257348]: 167 167
Dec 03 21:37:52 compute-0 systemd[1]: libpod-3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5.scope: Deactivated successfully.
Dec 03 21:37:52 compute-0 podman[257331]: 2025-12-03 21:37:52.425828945 +0000 UTC m=+0.198092507 container died 3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_greider, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:37:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v986: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-9c4e616f43b77a946d908b666282335cb1d15fe16be21f7e420663eb0957ce3c-merged.mount: Deactivated successfully.
Dec 03 21:37:52 compute-0 podman[257331]: 2025-12-03 21:37:52.476703323 +0000 UTC m=+0.248966915 container remove 3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jovial_greider, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:37:52 compute-0 systemd[1]: libpod-conmon-3a72b3c76e0d4c5d52a65f4050974003e8fe238b8daeea9b318ff85cf86d4ec5.scope: Deactivated successfully.
Dec 03 21:37:52 compute-0 podman[257372]: 2025-12-03 21:37:52.731366233 +0000 UTC m=+0.064861391 container create dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_franklin, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 03 21:37:52 compute-0 systemd[1]: Started libpod-conmon-dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b.scope.
Dec 03 21:37:52 compute-0 podman[257372]: 2025-12-03 21:37:52.704514061 +0000 UTC m=+0.038009269 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:37:52 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:37:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46a74f1cedb60ceabcee54f7d4df8e23f078db706bce5971f712d6890e7a2e7b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46a74f1cedb60ceabcee54f7d4df8e23f078db706bce5971f712d6890e7a2e7b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46a74f1cedb60ceabcee54f7d4df8e23f078db706bce5971f712d6890e7a2e7b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46a74f1cedb60ceabcee54f7d4df8e23f078db706bce5971f712d6890e7a2e7b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:37:52 compute-0 podman[257372]: 2025-12-03 21:37:52.832467353 +0000 UTC m=+0.165962471 container init dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_franklin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:37:52 compute-0 podman[257372]: 2025-12-03 21:37:52.842687602 +0000 UTC m=+0.176182720 container start dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_franklin, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:37:52 compute-0 podman[257372]: 2025-12-03 21:37:52.845599961 +0000 UTC m=+0.179095079 container attach dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_franklin, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:37:53 compute-0 lvm[257468]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:37:53 compute-0 lvm[257469]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:37:53 compute-0 lvm[257469]: VG ceph_vg1 finished
Dec 03 21:37:53 compute-0 lvm[257468]: VG ceph_vg0 finished
Dec 03 21:37:53 compute-0 lvm[257471]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:37:53 compute-0 lvm[257471]: VG ceph_vg2 finished
Dec 03 21:37:53 compute-0 lvm[257473]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:37:53 compute-0 lvm[257473]: VG ceph_vg2 finished
Dec 03 21:37:53 compute-0 serene_franklin[257389]: {}
Dec 03 21:37:53 compute-0 lvm[257475]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:37:53 compute-0 lvm[257475]: VG ceph_vg2 finished
Dec 03 21:37:53 compute-0 systemd[1]: libpod-dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b.scope: Deactivated successfully.
Dec 03 21:37:53 compute-0 podman[257372]: 2025-12-03 21:37:53.767521712 +0000 UTC m=+1.101016870 container died dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_franklin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 21:37:53 compute-0 systemd[1]: libpod-dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b.scope: Consumed 1.513s CPU time.
Dec 03 21:37:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-46a74f1cedb60ceabcee54f7d4df8e23f078db706bce5971f712d6890e7a2e7b-merged.mount: Deactivated successfully.
Dec 03 21:37:53 compute-0 podman[257372]: 2025-12-03 21:37:53.825669329 +0000 UTC m=+1.159164457 container remove dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=serene_franklin, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:37:53 compute-0 systemd[1]: libpod-conmon-dfeb71da5b3b866e69a130d124962069e0045ee9e34b0df8b6c10ed3ff060a1b.scope: Deactivated successfully.
Dec 03 21:37:53 compute-0 sudo[257294]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:37:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:37:53 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:37:53 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:37:53 compute-0 sudo[257490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:37:53 compute-0 sudo[257490]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:37:53 compute-0 sudo[257490]: pam_unix(sudo:session): session closed for user root
Dec 03 21:37:54 compute-0 ceph-mon[75204]: pgmap v986: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:37:54 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:37:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v987: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:56 compute-0 ceph-mon[75204]: pgmap v987: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v988: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:56 compute-0 rsyslogd[1006]: imjournal: 15395 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 03 21:37:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:37:57 compute-0 ceph-mon[75204]: pgmap v988: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v989: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:59 compute-0 ceph-mon[75204]: pgmap v989: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:37:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:37:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3938407216' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:37:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:37:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3938407216' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:38:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v990: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3938407216' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:38:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3938407216' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #48. Immutable memtables: 0.
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.484139) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 48
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797880484169, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1268, "num_deletes": 251, "total_data_size": 1308922, "memory_usage": 1334464, "flush_reason": "Manual Compaction"}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #49: started
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797880495288, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 49, "file_size": 1280505, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19351, "largest_seqno": 20618, "table_properties": {"data_size": 1274542, "index_size": 3294, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12605, "raw_average_key_size": 19, "raw_value_size": 1262502, "raw_average_value_size": 1975, "num_data_blocks": 151, "num_entries": 639, "num_filter_entries": 639, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797751, "oldest_key_time": 1764797751, "file_creation_time": 1764797880, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 49, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 12049 microseconds, and 5610 cpu microseconds.
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.496153) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #49: 1280505 bytes OK
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.496233) [db/memtable_list.cc:519] [default] Level-0 commit table #49 started
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.498195) [db/memtable_list.cc:722] [default] Level-0 commit table #49: memtable #1 done
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.498237) EVENT_LOG_v1 {"time_micros": 1764797880498225, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.498265) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1303203, prev total WAL file size 1303203, number of live WAL files 2.
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000045.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.499306) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [49(1250KB)], [47(5625KB)]
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797880499369, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [49], "files_L6": [47], "score": -1, "input_data_size": 7041482, "oldest_snapshot_seqno": -1}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #50: 4038 keys, 5844169 bytes, temperature: kUnknown
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797880543749, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 50, "file_size": 5844169, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5814906, "index_size": 18081, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10117, "raw_key_size": 97528, "raw_average_key_size": 24, "raw_value_size": 5740070, "raw_average_value_size": 1421, "num_data_blocks": 768, "num_entries": 4038, "num_filter_entries": 4038, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764797880, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.544203) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 5844169 bytes
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.545896) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.0 rd, 131.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 5.5 +0.0 blob) out(5.6 +0.0 blob), read-write-amplify(10.1) write-amplify(4.6) OK, records in: 4552, records dropped: 514 output_compression: NoCompression
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.545926) EVENT_LOG_v1 {"time_micros": 1764797880545910, "job": 24, "event": "compaction_finished", "compaction_time_micros": 44563, "compaction_time_cpu_micros": 26857, "output_level": 6, "num_output_files": 1, "total_output_size": 5844169, "num_input_records": 4552, "num_output_records": 4038, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000049.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797880546465, "job": 24, "event": "table_file_deletion", "file_number": 49}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764797880548198, "job": 24, "event": "table_file_deletion", "file_number": 47}
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.499168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.548298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.548305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.548309) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.548314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:38:00 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:38:00.548320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:38:01 compute-0 ceph-mon[75204]: pgmap v990: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:01 compute-0 anacron[34192]: Job `cron.weekly' started
Dec 03 21:38:01 compute-0 anacron[34192]: Job `cron.weekly' terminated
Dec 03 21:38:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:38:01 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 4617 writes, 20K keys, 4617 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4617 writes, 4617 syncs, 1.00 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1522 writes, 7118 keys, 1522 commit groups, 1.0 writes per commit group, ingest: 6.55 MB, 0.01 MB/s
                                           Interval WAL: 1522 writes, 1522 syncs, 1.00 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     97.1      0.18              0.07        12    0.015       0      0       0.0       0.0
                                             L6      1/0    5.57 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.2    128.8    106.4      0.53              0.21        11    0.049     42K   5814       0.0       0.0
                                            Sum      1/0    5.57 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.2     96.3    104.1      0.72              0.29        23    0.031     42K   5814       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.4     94.7     97.0      0.41              0.16        12    0.034     26K   3544       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    128.8    106.4      0.53              0.21        11    0.049     42K   5814       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     99.8      0.18              0.07        11    0.016       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     10.4      0.01              0.00         1    0.005       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.017, interval 0.007
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.07 GB write, 0.04 MB/s write, 0.07 GB read, 0.04 MB/s read, 0.7 seconds
                                           Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.06 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x56170d6e38d0#2 capacity: 308.00 MB usage: 6.89 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 9.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(689,6.54 MB,2.12229%) FilterBlock(24,124.17 KB,0.0393706%) IndexBlock(24,241.20 KB,0.0764772%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 03 21:38:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:02 compute-0 podman[257517]: 2025-12-03 21:38:02.170659289 +0000 UTC m=+0.108987297 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 03 21:38:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v991: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:03 compute-0 ceph-mon[75204]: pgmap v991: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v992: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:05 compute-0 ceph-mon[75204]: pgmap v992: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v993: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:07 compute-0 ceph-mon[75204]: pgmap v993: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v994: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:09 compute-0 ceph-mon[75204]: pgmap v994: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v995: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:11 compute-0 ceph-mon[75204]: pgmap v995: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v996: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:13 compute-0 ceph-mon[75204]: pgmap v996: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v997: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:15 compute-0 ceph-mon[75204]: pgmap v997: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:16 compute-0 podman[257545]: 2025-12-03 21:38:16.127972672 +0000 UTC m=+0.066470646 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec 03 21:38:16 compute-0 podman[257544]: 2025-12-03 21:38:16.158762099 +0000 UTC m=+0.091937200 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 03 21:38:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v998: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:17 compute-0 ceph-mon[75204]: pgmap v998: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v999: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:19 compute-0 ceph-mon[75204]: pgmap v999: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1000: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:38:21
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['.mgr', 'cephfs.cephfs.data', 'cephfs.cephfs.meta', 'vms', 'backups', 'images', 'volumes']
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:38:21 compute-0 nova_compute[241566]: 2025-12-03 21:38:21.565 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:21 compute-0 ceph-mon[75204]: pgmap v1000: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:38:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:38:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:38:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1001: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:23 compute-0 ceph-mon[75204]: pgmap v1001: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1002: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:24 compute-0 nova_compute[241566]: 2025-12-03 21:38:24.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:24 compute-0 nova_compute[241566]: 2025-12-03 21:38:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:38:24 compute-0 nova_compute[241566]: 2025-12-03 21:38:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:38:24 compute-0 nova_compute[241566]: 2025-12-03 21:38:24.627 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:38:24 compute-0 nova_compute[241566]: 2025-12-03 21:38:24.628 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:24 compute-0 nova_compute[241566]: 2025-12-03 21:38:24.628 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:25 compute-0 nova_compute[241566]: 2025-12-03 21:38:25.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:25 compute-0 ceph-mon[75204]: pgmap v1002: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:25 compute-0 nova_compute[241566]: 2025-12-03 21:38:25.844 241570 DEBUG oslo_concurrency.processutils [None req-cf8f4e5e-8023-4cda-a194-a324473da3c2 1278db95002f4a508698b0c865809410 e82fda53634b410e910c801bc1b00db2 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:38:25 compute-0 nova_compute[241566]: 2025-12-03 21:38:25.886 241570 DEBUG oslo_concurrency.processutils [None req-cf8f4e5e-8023-4cda-a194-a324473da3c2 1278db95002f4a508698b0c865809410 e82fda53634b410e910c801bc1b00db2 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:38:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1003: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:26 compute-0 nova_compute[241566]: 2025-12-03 21:38:26.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:27 compute-0 nova_compute[241566]: 2025-12-03 21:38:27.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:27 compute-0 nova_compute[241566]: 2025-12-03 21:38:27.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:38:27 compute-0 ceph-mon[75204]: pgmap v1003: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:38:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:38:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1004: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:28 compute-0 nova_compute[241566]: 2025-12-03 21:38:28.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:29 compute-0 nova_compute[241566]: 2025-12-03 21:38:29.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:38:29 compute-0 nova_compute[241566]: 2025-12-03 21:38:29.596 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:38:29 compute-0 nova_compute[241566]: 2025-12-03 21:38:29.597 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:38:29 compute-0 nova_compute[241566]: 2025-12-03 21:38:29.597 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:38:29 compute-0 nova_compute[241566]: 2025-12-03 21:38:29.597 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:38:29 compute-0 nova_compute[241566]: 2025-12-03 21:38:29.598 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:38:29 compute-0 ceph-mon[75204]: pgmap v1004: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:38:30 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2384620290' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.159 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.561s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.382 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.384 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5155MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.385 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.385 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:38:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1005: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.478 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.478 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:38:30 compute-0 nova_compute[241566]: 2025-12-03 21:38:30.493 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:38:30 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2384620290' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:38:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:38:31 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/199793592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:38:31 compute-0 nova_compute[241566]: 2025-12-03 21:38:31.081 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.587s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:38:31 compute-0 nova_compute[241566]: 2025-12-03 21:38:31.087 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:38:31 compute-0 nova_compute[241566]: 2025-12-03 21:38:31.105 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:38:31 compute-0 nova_compute[241566]: 2025-12-03 21:38:31.107 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:38:31 compute-0 nova_compute[241566]: 2025-12-03 21:38:31.107 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:38:31 compute-0 ceph-mon[75204]: pgmap v1005: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:31 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/199793592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:38:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1006: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:33 compute-0 podman[257628]: 2025-12-03 21:38:33.161327863 +0000 UTC m=+0.100155010 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 03 21:38:33 compute-0 ceph-mon[75204]: pgmap v1006: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:34 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:38:34.186 151937 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '4a:b3:fa', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '92:85:3a:67:f5:74'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 03 21:38:34 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:38:34.188 151937 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 03 21:38:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1007: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:35 compute-0 ceph-mon[75204]: pgmap v1007: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1008: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:37 compute-0 ceph-mon[75204]: pgmap v1008: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1009: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:39 compute-0 ceph-mon[75204]: pgmap v1009: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1010: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:41 compute-0 ceph-mon[75204]: pgmap v1010: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:42 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:38:42.190 151937 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f27c01e7-5b62-4209-a664-3ae50b74644d, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 03 21:38:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1011: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:43 compute-0 ceph-mon[75204]: pgmap v1011: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1012: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:45 compute-0 ceph-mon[75204]: pgmap v1012: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1013: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:47 compute-0 podman[257655]: 2025-12-03 21:38:47.13452441 +0000 UTC m=+0.074587393 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:38:47 compute-0 podman[257654]: 2025-12-03 21:38:47.143348108 +0000 UTC m=+0.082427885 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:38:47 compute-0 ceph-mon[75204]: pgmap v1013: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1014: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:38:48.943 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:38:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:38:48.944 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:38:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:38:48.944 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:38:49 compute-0 ceph-mon[75204]: pgmap v1014: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1015: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:38:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:38:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:38:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:38:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:38:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:38:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:51 compute-0 ceph-mon[75204]: pgmap v1015: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1016: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:53 compute-0 ceph-mon[75204]: pgmap v1016: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:54 compute-0 sudo[257693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:38:54 compute-0 sudo[257693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:54 compute-0 sudo[257693]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:54 compute-0 sudo[257718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:38:54 compute-0 sudo[257718]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1017: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:54 compute-0 sudo[257718]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:55 compute-0 sudo[257774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:38:55 compute-0 sudo[257774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:55 compute-0 sudo[257774]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:55 compute-0 sudo[257799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 list-networks
Dec 03 21:38:55 compute-0 sudo[257799]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:55 compute-0 sudo[257799]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:38:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:38:55 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:38:55 compute-0 sudo[257843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:38:55 compute-0 sudo[257843]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:55 compute-0 sudo[257843]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:55 compute-0 sudo[257868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:38:55 compute-0 sudo[257868]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:55 compute-0 podman[257907]: 2025-12-03 21:38:55.929153419 +0000 UTC m=+0.059915350 container create 0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_hertz, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:38:55 compute-0 systemd[1]: Started libpod-conmon-0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20.scope.
Dec 03 21:38:56 compute-0 podman[257907]: 2025-12-03 21:38:55.90833418 +0000 UTC m=+0.039096141 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:38:56 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:38:56 compute-0 podman[257907]: 2025-12-03 21:38:56.044205538 +0000 UTC m=+0.174967529 container init 0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_hertz, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:38:56 compute-0 podman[257907]: 2025-12-03 21:38:56.061222995 +0000 UTC m=+0.191984936 container start 0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_hertz, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, OSD_FLAVOR=default)
Dec 03 21:38:56 compute-0 podman[257907]: 2025-12-03 21:38:56.065701286 +0000 UTC m=+0.196463307 container attach 0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_hertz, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:38:56 compute-0 reverent_hertz[257923]: 167 167
Dec 03 21:38:56 compute-0 systemd[1]: libpod-0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20.scope: Deactivated successfully.
Dec 03 21:38:56 compute-0 podman[257907]: 2025-12-03 21:38:56.076420993 +0000 UTC m=+0.207182944 container died 0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_hertz, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:38:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-ed72e68c72f1aa9b7e9316c26a005d221199c1168a428ada2e2fe36fd3269950-merged.mount: Deactivated successfully.
Dec 03 21:38:56 compute-0 podman[257907]: 2025-12-03 21:38:56.126851217 +0000 UTC m=+0.257613128 container remove 0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=reverent_hertz, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:38:56 compute-0 systemd[1]: libpod-conmon-0bef449bc320516987948b257f863cc6d5d2d4c05cb0d4b67f3be0aed8ef4a20.scope: Deactivated successfully.
Dec 03 21:38:56 compute-0 podman[257945]: 2025-12-03 21:38:56.369910153 +0000 UTC m=+0.069273661 container create 58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=condescending_chandrasekhar, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:38:56 compute-0 ceph-mon[75204]: pgmap v1017: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:38:56 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:38:56 compute-0 systemd[1]: Started libpod-conmon-58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d.scope.
Dec 03 21:38:56 compute-0 podman[257945]: 2025-12-03 21:38:56.340874364 +0000 UTC m=+0.040237912 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:38:56 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:38:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfd02a2f384bb37632d0a7b4ade61aa1e0e99447a0146eecd81d330e1c6aa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfd02a2f384bb37632d0a7b4ade61aa1e0e99447a0146eecd81d330e1c6aa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfd02a2f384bb37632d0a7b4ade61aa1e0e99447a0146eecd81d330e1c6aa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfd02a2f384bb37632d0a7b4ade61aa1e0e99447a0146eecd81d330e1c6aa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbbcfd02a2f384bb37632d0a7b4ade61aa1e0e99447a0146eecd81d330e1c6aa/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:56 compute-0 podman[257945]: 2025-12-03 21:38:56.477453551 +0000 UTC m=+0.176817099 container init 58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=condescending_chandrasekhar, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:38:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1018: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:56 compute-0 podman[257945]: 2025-12-03 21:38:56.489436752 +0000 UTC m=+0.188800250 container start 58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=condescending_chandrasekhar, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True)
Dec 03 21:38:56 compute-0 podman[257945]: 2025-12-03 21:38:56.495835054 +0000 UTC m=+0.195198562 container attach 58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=condescending_chandrasekhar, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 03 21:38:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:38:57 compute-0 condescending_chandrasekhar[257961]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:38:57 compute-0 condescending_chandrasekhar[257961]: --> All data devices are unavailable
Dec 03 21:38:57 compute-0 systemd[1]: libpod-58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d.scope: Deactivated successfully.
Dec 03 21:38:57 compute-0 podman[257945]: 2025-12-03 21:38:57.136732752 +0000 UTC m=+0.836096270 container died 58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=condescending_chandrasekhar, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:38:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-bbbcfd02a2f384bb37632d0a7b4ade61aa1e0e99447a0146eecd81d330e1c6aa-merged.mount: Deactivated successfully.
Dec 03 21:38:57 compute-0 podman[257945]: 2025-12-03 21:38:57.19775363 +0000 UTC m=+0.897117138 container remove 58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=condescending_chandrasekhar, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:38:57 compute-0 systemd[1]: libpod-conmon-58dddf7247e7c4a6e294d4dc19233c8bc1cfc8ce8e0ac1cf4f6d1e460ca9193d.scope: Deactivated successfully.
Dec 03 21:38:57 compute-0 sudo[257868]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:57 compute-0 sudo[257993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:38:57 compute-0 sudo[257993]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:57 compute-0 sudo[257993]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:57 compute-0 sudo[258018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:38:57 compute-0 sudo[258018]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:57 compute-0 podman[258057]: 2025-12-03 21:38:57.826126261 +0000 UTC m=+0.074293196 container create 068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_borg, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:38:57 compute-0 systemd[1]: Started libpod-conmon-068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa.scope.
Dec 03 21:38:57 compute-0 podman[258057]: 2025-12-03 21:38:57.797895813 +0000 UTC m=+0.046062768 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:38:57 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:38:57 compute-0 podman[258057]: 2025-12-03 21:38:57.906441598 +0000 UTC m=+0.154608603 container init 068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_borg, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:38:57 compute-0 podman[258057]: 2025-12-03 21:38:57.913667941 +0000 UTC m=+0.161834886 container start 068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_borg, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:38:57 compute-0 podman[258057]: 2025-12-03 21:38:57.918268075 +0000 UTC m=+0.166435070 container attach 068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_borg, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:38:57 compute-0 peaceful_borg[258074]: 167 167
Dec 03 21:38:57 compute-0 systemd[1]: libpod-068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa.scope: Deactivated successfully.
Dec 03 21:38:57 compute-0 podman[258057]: 2025-12-03 21:38:57.920610488 +0000 UTC m=+0.168777443 container died 068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_borg, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3)
Dec 03 21:38:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-64aacfb9dd8213ee57025e4c38f86d1d7ae313697e6c0f485a17313c2d2aaa56-merged.mount: Deactivated successfully.
Dec 03 21:38:57 compute-0 podman[258057]: 2025-12-03 21:38:57.970153898 +0000 UTC m=+0.218320843 container remove 068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_borg, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:38:57 compute-0 systemd[1]: libpod-conmon-068f3106c10b4ba9dee95ad363130e2c88afd1a5fac2a7427ac23edc98522efa.scope: Deactivated successfully.
Dec 03 21:38:58 compute-0 podman[258096]: 2025-12-03 21:38:58.236269353 +0000 UTC m=+0.073890755 container create fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_euler, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Dec 03 21:38:58 compute-0 podman[258096]: 2025-12-03 21:38:58.206661268 +0000 UTC m=+0.044282730 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:38:58 compute-0 systemd[1]: Started libpod-conmon-fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6.scope.
Dec 03 21:38:58 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:38:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748b8002f515684ae9c2386e688941354e548bc84cfc86bb0410313373e1f9c1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748b8002f515684ae9c2386e688941354e548bc84cfc86bb0410313373e1f9c1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748b8002f515684ae9c2386e688941354e548bc84cfc86bb0410313373e1f9c1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748b8002f515684ae9c2386e688941354e548bc84cfc86bb0410313373e1f9c1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:58 compute-0 podman[258096]: 2025-12-03 21:38:58.382858819 +0000 UTC m=+0.220480281 container init fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_euler, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:38:58 compute-0 podman[258096]: 2025-12-03 21:38:58.395348334 +0000 UTC m=+0.232969736 container start fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_euler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:38:58 compute-0 podman[258096]: 2025-12-03 21:38:58.399490296 +0000 UTC m=+0.237111688 container attach fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_euler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:38:58 compute-0 ceph-mon[75204]: pgmap v1018: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1019: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:38:58 compute-0 charming_euler[258112]: {
Dec 03 21:38:58 compute-0 charming_euler[258112]:     "0": [
Dec 03 21:38:58 compute-0 charming_euler[258112]:         {
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "devices": [
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "/dev/loop3"
Dec 03 21:38:58 compute-0 charming_euler[258112]:             ],
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_name": "ceph_lv0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_size": "21470642176",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "name": "ceph_lv0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "tags": {
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cluster_name": "ceph",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.crush_device_class": "",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.encrypted": "0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.objectstore": "bluestore",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osd_id": "0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.type": "block",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.vdo": "0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.with_tpm": "0"
Dec 03 21:38:58 compute-0 charming_euler[258112]:             },
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "type": "block",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "vg_name": "ceph_vg0"
Dec 03 21:38:58 compute-0 charming_euler[258112]:         }
Dec 03 21:38:58 compute-0 charming_euler[258112]:     ],
Dec 03 21:38:58 compute-0 charming_euler[258112]:     "1": [
Dec 03 21:38:58 compute-0 charming_euler[258112]:         {
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "devices": [
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "/dev/loop4"
Dec 03 21:38:58 compute-0 charming_euler[258112]:             ],
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_name": "ceph_lv1",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_size": "21470642176",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "name": "ceph_lv1",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "tags": {
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cluster_name": "ceph",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.crush_device_class": "",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.encrypted": "0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.objectstore": "bluestore",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osd_id": "1",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.type": "block",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.vdo": "0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.with_tpm": "0"
Dec 03 21:38:58 compute-0 charming_euler[258112]:             },
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "type": "block",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "vg_name": "ceph_vg1"
Dec 03 21:38:58 compute-0 charming_euler[258112]:         }
Dec 03 21:38:58 compute-0 charming_euler[258112]:     ],
Dec 03 21:38:58 compute-0 charming_euler[258112]:     "2": [
Dec 03 21:38:58 compute-0 charming_euler[258112]:         {
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "devices": [
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "/dev/loop5"
Dec 03 21:38:58 compute-0 charming_euler[258112]:             ],
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_name": "ceph_lv2",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_size": "21470642176",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "name": "ceph_lv2",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "tags": {
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.cluster_name": "ceph",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.crush_device_class": "",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.encrypted": "0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.objectstore": "bluestore",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osd_id": "2",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.type": "block",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.vdo": "0",
Dec 03 21:38:58 compute-0 charming_euler[258112]:                 "ceph.with_tpm": "0"
Dec 03 21:38:58 compute-0 charming_euler[258112]:             },
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "type": "block",
Dec 03 21:38:58 compute-0 charming_euler[258112]:             "vg_name": "ceph_vg2"
Dec 03 21:38:58 compute-0 charming_euler[258112]:         }
Dec 03 21:38:58 compute-0 charming_euler[258112]:     ]
Dec 03 21:38:58 compute-0 charming_euler[258112]: }
Dec 03 21:38:58 compute-0 systemd[1]: libpod-fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6.scope: Deactivated successfully.
Dec 03 21:38:58 compute-0 podman[258096]: 2025-12-03 21:38:58.720271248 +0000 UTC m=+0.557892620 container died fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_euler, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:38:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-748b8002f515684ae9c2386e688941354e548bc84cfc86bb0410313373e1f9c1-merged.mount: Deactivated successfully.
Dec 03 21:38:58 compute-0 podman[258096]: 2025-12-03 21:38:58.777242427 +0000 UTC m=+0.614863829 container remove fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=charming_euler, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:38:58 compute-0 systemd[1]: libpod-conmon-fb59cd30b09191e491cb68555bb53b97cc49a6420e04f281abd9667ab72ba2f6.scope: Deactivated successfully.
Dec 03 21:38:58 compute-0 sudo[258018]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:58 compute-0 sudo[258132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:38:58 compute-0 sudo[258132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:58 compute-0 sudo[258132]: pam_unix(sudo:session): session closed for user root
Dec 03 21:38:59 compute-0 sudo[258157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:38:59 compute-0 sudo[258157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:38:59 compute-0 podman[258193]: 2025-12-03 21:38:59.350451108 +0000 UTC m=+0.058393749 container create 48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=happy_shaw, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:38:59 compute-0 systemd[1]: Started libpod-conmon-48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d.scope.
Dec 03 21:38:59 compute-0 podman[258193]: 2025-12-03 21:38:59.323698 +0000 UTC m=+0.031640701 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:38:59 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:38:59 compute-0 podman[258193]: 2025-12-03 21:38:59.441323588 +0000 UTC m=+0.149266219 container init 48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=happy_shaw, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030)
Dec 03 21:38:59 compute-0 podman[258193]: 2025-12-03 21:38:59.451771848 +0000 UTC m=+0.159714459 container start 48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=happy_shaw, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:38:59 compute-0 podman[258193]: 2025-12-03 21:38:59.455719385 +0000 UTC m=+0.163662016 container attach 48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=happy_shaw, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle)
Dec 03 21:38:59 compute-0 happy_shaw[258210]: 167 167
Dec 03 21:38:59 compute-0 systemd[1]: libpod-48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d.scope: Deactivated successfully.
Dec 03 21:38:59 compute-0 podman[258193]: 2025-12-03 21:38:59.45890941 +0000 UTC m=+0.166852051 container died 48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=happy_shaw, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3)
Dec 03 21:38:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-473f79dd126a0b61cdea17267b14308abf7df11add406a09d1a2816e49fd19d5-merged.mount: Deactivated successfully.
Dec 03 21:38:59 compute-0 podman[258193]: 2025-12-03 21:38:59.510476734 +0000 UTC m=+0.218419365 container remove 48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=happy_shaw, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:38:59 compute-0 systemd[1]: libpod-conmon-48acbd99009f3aeb1b0354d8a08e19f17cbf94ea6051ca69ac6df5fe57e9e48d.scope: Deactivated successfully.
Dec 03 21:38:59 compute-0 podman[258232]: 2025-12-03 21:38:59.71142042 +0000 UTC m=+0.067574426 container create 5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=lucid_mendeleev, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:38:59 compute-0 podman[258232]: 2025-12-03 21:38:59.684229039 +0000 UTC m=+0.040383155 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:38:59 compute-0 systemd[1]: Started libpod-conmon-5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc.scope.
Dec 03 21:38:59 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:38:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aefc3d05ec477f865a1bd33363177a77ad3925a164d72a72a037659a3d4ad008/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aefc3d05ec477f865a1bd33363177a77ad3925a164d72a72a037659a3d4ad008/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aefc3d05ec477f865a1bd33363177a77ad3925a164d72a72a037659a3d4ad008/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aefc3d05ec477f865a1bd33363177a77ad3925a164d72a72a037659a3d4ad008/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:38:59 compute-0 podman[258232]: 2025-12-03 21:38:59.86004814 +0000 UTC m=+0.216202226 container init 5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=lucid_mendeleev, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:38:59 compute-0 podman[258232]: 2025-12-03 21:38:59.869481763 +0000 UTC m=+0.225635799 container start 5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=lucid_mendeleev, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:38:59 compute-0 podman[258232]: 2025-12-03 21:38:59.874473567 +0000 UTC m=+0.230627653 container attach 5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=lucid_mendeleev, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:38:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:38:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3829623944' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:38:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:38:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3829623944' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:39:00 compute-0 ceph-mon[75204]: pgmap v1019: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3829623944' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:39:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3829623944' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:39:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1020: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:00 compute-0 lvm[258328]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:39:00 compute-0 lvm[258327]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:39:00 compute-0 lvm[258330]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:39:00 compute-0 lvm[258328]: VG ceph_vg1 finished
Dec 03 21:39:00 compute-0 lvm[258327]: VG ceph_vg0 finished
Dec 03 21:39:00 compute-0 lvm[258330]: VG ceph_vg2 finished
Dec 03 21:39:00 compute-0 lvm[258332]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:39:00 compute-0 lvm[258332]: VG ceph_vg2 finished
Dec 03 21:39:00 compute-0 lucid_mendeleev[258249]: {}
Dec 03 21:39:00 compute-0 systemd[1]: libpod-5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc.scope: Deactivated successfully.
Dec 03 21:39:00 compute-0 systemd[1]: libpod-5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc.scope: Consumed 1.387s CPU time.
Dec 03 21:39:00 compute-0 podman[258232]: 2025-12-03 21:39:00.766500468 +0000 UTC m=+1.122654514 container died 5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=lucid_mendeleev, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:39:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-aefc3d05ec477f865a1bd33363177a77ad3925a164d72a72a037659a3d4ad008-merged.mount: Deactivated successfully.
Dec 03 21:39:00 compute-0 podman[258232]: 2025-12-03 21:39:00.819907392 +0000 UTC m=+1.176061418 container remove 5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=lucid_mendeleev, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, OSD_FLAVOR=default)
Dec 03 21:39:00 compute-0 systemd[1]: libpod-conmon-5ecdff2123c0f11fbbc63c6200ace8b759bede935808a098163abd94027d6ffc.scope: Deactivated successfully.
Dec 03 21:39:00 compute-0 sudo[258157]: pam_unix(sudo:session): session closed for user root
Dec 03 21:39:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:39:00 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:39:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:39:00 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:39:00 compute-0 sudo[258344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:39:00 compute-0 sudo[258344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:39:00 compute-0 sudo[258344]: pam_unix(sudo:session): session closed for user root
Dec 03 21:39:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:01 compute-0 ceph-mon[75204]: pgmap v1020: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:39:01 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:39:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1021: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:03 compute-0 ceph-mon[75204]: pgmap v1021: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:04 compute-0 podman[258369]: 2025-12-03 21:39:04.209739206 +0000 UTC m=+0.144238824 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 03 21:39:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1022: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:05 compute-0 ceph-mon[75204]: pgmap v1022: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1023: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:07 compute-0 ceph-mon[75204]: pgmap v1023: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1024: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:09 compute-0 ceph-mon[75204]: pgmap v1024: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:39:09 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 5730 writes, 23K keys, 5730 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5730 writes, 1063 syncs, 5.39 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1357 writes, 3840 keys, 1357 commit groups, 1.0 writes per commit group, ingest: 2.14 MB, 0.00 MB/s
                                           Interval WAL: 1357 writes, 612 syncs, 2.22 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 03 21:39:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1025: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:11 compute-0 ceph-mon[75204]: pgmap v1025: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1026: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:13 compute-0 ceph-mon[75204]: pgmap v1026: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1027: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:39:14 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 6189 writes, 25K keys, 6189 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6189 writes, 1252 syncs, 4.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1674 writes, 4716 keys, 1674 commit groups, 1.0 writes per commit group, ingest: 2.71 MB, 0.00 MB/s
                                           Interval WAL: 1674 writes, 747 syncs, 2.24 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 03 21:39:15 compute-0 ceph-mon[75204]: pgmap v1027: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1028: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:17 compute-0 ceph-mon[75204]: pgmap v1028: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:18 compute-0 podman[258397]: 2025-12-03 21:39:18.162832883 +0000 UTC m=+0.086547363 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 21:39:18 compute-0 podman[258396]: 2025-12-03 21:39:18.169743869 +0000 UTC m=+0.099629526 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:39:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1029: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:39:19 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 5828 writes, 23K keys, 5828 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5828 writes, 1121 syncs, 5.20 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1678 writes, 4312 keys, 1678 commit groups, 1.0 writes per commit group, ingest: 2.35 MB, 0.00 MB/s
                                           Interval WAL: 1678 writes, 755 syncs, 2.22 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 03 21:39:19 compute-0 ceph-mon[75204]: pgmap v1029: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1030: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:39:21
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'images', 'cephfs.cephfs.data', 'backups', 'volumes', '.mgr', 'vms']
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [devicehealth INFO root] Check health
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:39:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:21 compute-0 ceph-mon[75204]: pgmap v1030: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:39:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:39:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1031: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:23 compute-0 ceph-mon[75204]: pgmap v1031: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:24 compute-0 nova_compute[241566]: 2025-12-03 21:39:24.102 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1032: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:24 compute-0 nova_compute[241566]: 2025-12-03 21:39:24.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:24 compute-0 nova_compute[241566]: 2025-12-03 21:39:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:39:24 compute-0 nova_compute[241566]: 2025-12-03 21:39:24.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:39:24 compute-0 nova_compute[241566]: 2025-12-03 21:39:24.572 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:39:25 compute-0 nova_compute[241566]: 2025-12-03 21:39:25.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:25 compute-0 nova_compute[241566]: 2025-12-03 21:39:25.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:25 compute-0 ceph-mon[75204]: pgmap v1032: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1033: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:26 compute-0 nova_compute[241566]: 2025-12-03 21:39:26.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:27 compute-0 nova_compute[241566]: 2025-12-03 21:39:27.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:39:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:39:27 compute-0 ceph-mon[75204]: pgmap v1033: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1034: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:28 compute-0 nova_compute[241566]: 2025-12-03 21:39:28.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:28 compute-0 nova_compute[241566]: 2025-12-03 21:39:28.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:28 compute-0 nova_compute[241566]: 2025-12-03 21:39:28.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:39:29 compute-0 ceph-mon[75204]: pgmap v1034: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1035: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:30 compute-0 nova_compute[241566]: 2025-12-03 21:39:30.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:30 compute-0 nova_compute[241566]: 2025-12-03 21:39:30.593 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:39:30 compute-0 nova_compute[241566]: 2025-12-03 21:39:30.593 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:39:30 compute-0 nova_compute[241566]: 2025-12-03 21:39:30.594 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:39:30 compute-0 nova_compute[241566]: 2025-12-03 21:39:30.594 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:39:30 compute-0 nova_compute[241566]: 2025-12-03 21:39:30.595 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:39:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:39:31 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2628005888' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.185 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.590s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.459 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.461 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5132MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.461 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.462 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.551 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.551 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:39:31 compute-0 nova_compute[241566]: 2025-12-03 21:39:31.580 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:39:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:31 compute-0 ceph-mon[75204]: pgmap v1035: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:31 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2628005888' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:39:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:39:32 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3593444517' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:39:32 compute-0 nova_compute[241566]: 2025-12-03 21:39:32.156 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.577s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:39:32 compute-0 nova_compute[241566]: 2025-12-03 21:39:32.166 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:39:32 compute-0 nova_compute[241566]: 2025-12-03 21:39:32.191 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:39:32 compute-0 nova_compute[241566]: 2025-12-03 21:39:32.194 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:39:32 compute-0 nova_compute[241566]: 2025-12-03 21:39:32.194 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:39:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1036: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:32 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3593444517' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:39:33 compute-0 ceph-mon[75204]: pgmap v1036: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1037: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:35 compute-0 podman[258477]: 2025-12-03 21:39:35.250405991 +0000 UTC m=+0.182574942 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:39:35 compute-0 ceph-mon[75204]: pgmap v1037: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:36 compute-0 nova_compute[241566]: 2025-12-03 21:39:36.189 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:39:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1038: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:37 compute-0 ceph-mon[75204]: pgmap v1038: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1039: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:39 compute-0 ceph-mon[75204]: pgmap v1039: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1040: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:41 compute-0 ceph-mon[75204]: pgmap v1040: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1041: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:44 compute-0 ceph-mon[75204]: pgmap v1041: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1042: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:46 compute-0 ceph-mon[75204]: pgmap v1042: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1043: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:48 compute-0 ceph-mon[75204]: pgmap v1043: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1044: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:39:48.944 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:39:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:39:48.945 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:39:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:39:48.945 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:39:49 compute-0 podman[258504]: 2025-12-03 21:39:49.159842528 +0000 UTC m=+0.088026524 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:39:49 compute-0 podman[258505]: 2025-12-03 21:39:49.184093399 +0000 UTC m=+0.083408061 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 03 21:39:50 compute-0 ceph-mon[75204]: pgmap v1044: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1045: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:39:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:39:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:39:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:39:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:39:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:39:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:52 compute-0 ceph-mon[75204]: pgmap v1045: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1046: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:54 compute-0 ceph-mon[75204]: pgmap v1046: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1047: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:56 compute-0 ceph-mon[75204]: pgmap v1047: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1048: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:39:58 compute-0 ceph-mon[75204]: pgmap v1048: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1049: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:39:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:39:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3985118692' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:39:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:39:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3985118692' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:40:00 compute-0 ceph-mon[75204]: pgmap v1049: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3985118692' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:40:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3985118692' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:40:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1050: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:01 compute-0 sudo[258544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:40:01 compute-0 sudo[258544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:01 compute-0 sudo[258544]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:01 compute-0 sudo[258569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 check-host
Dec 03 21:40:01 compute-0 sudo[258569]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:01 compute-0 sudo[258569]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:40:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:40:01 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:01 compute-0 sudo[258614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:40:01 compute-0 sudo[258614]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:01 compute-0 sudo[258614]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:01 compute-0 sudo[258639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:40:01 compute-0 sudo[258639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:02 compute-0 ceph-mon[75204]: pgmap v1050: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:02 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:02 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:02 compute-0 sudo[258639]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1051: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:02 compute-0 sudo[258696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:40:02 compute-0 sudo[258696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:02 compute-0 sudo[258696]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:02 compute-0 sudo[258721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- inventory --format=json-pretty --filter-for-batch
Dec 03 21:40:02 compute-0 sudo[258721]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:03 compute-0 podman[258756]: 2025-12-03 21:40:03.026131797 +0000 UTC m=+0.067355460 container create 8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_babbage, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030)
Dec 03 21:40:03 compute-0 systemd[1]: Started libpod-conmon-8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d.scope.
Dec 03 21:40:03 compute-0 podman[258756]: 2025-12-03 21:40:02.997145419 +0000 UTC m=+0.038369092 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:03 compute-0 podman[258756]: 2025-12-03 21:40:03.129646486 +0000 UTC m=+0.170870169 container init 8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_babbage, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:40:03 compute-0 podman[258756]: 2025-12-03 21:40:03.141006191 +0000 UTC m=+0.182229864 container start 8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_babbage, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:40:03 compute-0 podman[258756]: 2025-12-03 21:40:03.145151613 +0000 UTC m=+0.186375346 container attach 8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_babbage, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:40:03 compute-0 confident_babbage[258772]: 167 167
Dec 03 21:40:03 compute-0 systemd[1]: libpod-8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d.scope: Deactivated successfully.
Dec 03 21:40:03 compute-0 podman[258756]: 2025-12-03 21:40:03.149598531 +0000 UTC m=+0.190822194 container died 8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_babbage, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 03 21:40:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-8e071b05d4d70a31427d70d56c008b4e5c1d060f361cf3f86565be6a0fe38202-merged.mount: Deactivated successfully.
Dec 03 21:40:03 compute-0 podman[258756]: 2025-12-03 21:40:03.207349242 +0000 UTC m=+0.248572915 container remove 8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=confident_babbage, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:40:03 compute-0 systemd[1]: libpod-conmon-8b806e6254b602ed1ed44ee462e8e1fe6937d61f6c9e3acbe757f8c172c84e9d.scope: Deactivated successfully.
Dec 03 21:40:03 compute-0 podman[258798]: 2025-12-03 21:40:03.451244171 +0000 UTC m=+0.065958183 container create f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_wiles, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:40:03 compute-0 systemd[1]: Started libpod-conmon-f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09.scope.
Dec 03 21:40:03 compute-0 podman[258798]: 2025-12-03 21:40:03.429052995 +0000 UTC m=+0.043766987 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:03 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/765ab9807ebabbcaee1f29d5db33f2310264c8e5bced45f2725ddec9fdbe6971/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/765ab9807ebabbcaee1f29d5db33f2310264c8e5bced45f2725ddec9fdbe6971/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/765ab9807ebabbcaee1f29d5db33f2310264c8e5bced45f2725ddec9fdbe6971/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/765ab9807ebabbcaee1f29d5db33f2310264c8e5bced45f2725ddec9fdbe6971/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:03 compute-0 podman[258798]: 2025-12-03 21:40:03.567505462 +0000 UTC m=+0.182219524 container init f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_wiles, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030)
Dec 03 21:40:03 compute-0 podman[258798]: 2025-12-03 21:40:03.579087624 +0000 UTC m=+0.193801636 container start f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_wiles, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:40:03 compute-0 podman[258798]: 2025-12-03 21:40:03.583527752 +0000 UTC m=+0.198241784 container attach f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_wiles, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: pgmap v1051: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]: [
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:     {
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "available": false,
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "being_replaced": false,
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "ceph_device_lvm": false,
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "lsm_data": {},
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "lvs": [],
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "path": "/dev/sr0",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "rejected_reasons": [
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "Insufficient space (<5GB)",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "Has a FileSystem"
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         ],
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         "sys_api": {
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "actuators": null,
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "device_nodes": [
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:                 "sr0"
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             ],
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "devname": "sr0",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "human_readable_size": "482.00 KB",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "id_bus": "ata",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "model": "QEMU DVD-ROM",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "nr_requests": "2",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "parent": "/dev/sr0",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "partitions": {},
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "path": "/dev/sr0",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "removable": "1",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "rev": "2.5+",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "ro": "0",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "rotational": "1",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "sas_address": "",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "sas_device_handle": "",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "scheduler_mode": "mq-deadline",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "sectors": 0,
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "sectorsize": "2048",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "size": 493568.0,
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "support_discard": "2048",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "type": "disk",
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:             "vendor": "QEMU"
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:         }
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]:     }
Dec 03 21:40:04 compute-0 flamboyant_wiles[258814]: ]
Dec 03 21:40:04 compute-0 systemd[1]: libpod-f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09.scope: Deactivated successfully.
Dec 03 21:40:04 compute-0 podman[258798]: 2025-12-03 21:40:04.156508717 +0000 UTC m=+0.771222719 container died f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_wiles, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3)
Dec 03 21:40:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-765ab9807ebabbcaee1f29d5db33f2310264c8e5bced45f2725ddec9fdbe6971-merged.mount: Deactivated successfully.
Dec 03 21:40:04 compute-0 podman[258798]: 2025-12-03 21:40:04.213392484 +0000 UTC m=+0.828106456 container remove f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=flamboyant_wiles, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:40:04 compute-0 systemd[1]: libpod-conmon-f3fcf45952b32c8f596ff0205a7bf3f745312f36d127e26a8fa64cf2f67c4c09.scope: Deactivated successfully.
Dec 03 21:40:04 compute-0 sudo[258721]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:40:04 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:40:04 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:40:04 compute-0 sudo[259567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:40:04 compute-0 sudo[259567]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:04 compute-0 sudo[259567]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:04 compute-0 sudo[259592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:40:04 compute-0 sudo[259592]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1052: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:04 compute-0 podman[259628]: 2025-12-03 21:40:04.871400331 +0000 UTC m=+0.076381702 container create 9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_dewdney, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:40:04 compute-0 systemd[1]: Started libpod-conmon-9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b.scope.
Dec 03 21:40:04 compute-0 podman[259628]: 2025-12-03 21:40:04.841125158 +0000 UTC m=+0.046106569 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:04 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:04 compute-0 podman[259628]: 2025-12-03 21:40:04.969454533 +0000 UTC m=+0.174435924 container init 9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:40:04 compute-0 podman[259628]: 2025-12-03 21:40:04.979274306 +0000 UTC m=+0.184255677 container start 9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_dewdney, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 03 21:40:04 compute-0 podman[259628]: 2025-12-03 21:40:04.983286065 +0000 UTC m=+0.188267476 container attach 9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_dewdney, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, ceph=True)
Dec 03 21:40:04 compute-0 naughty_dewdney[259644]: 167 167
Dec 03 21:40:04 compute-0 systemd[1]: libpod-9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b.scope: Deactivated successfully.
Dec 03 21:40:04 compute-0 podman[259628]: 2025-12-03 21:40:04.987274181 +0000 UTC m=+0.192255552 container died 9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_dewdney, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:40:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-62adb8949b483ff01644bfaf07b2668235ddd45edf8b539ed5889e17422d9168-merged.mount: Deactivated successfully.
Dec 03 21:40:05 compute-0 podman[259628]: 2025-12-03 21:40:05.043440839 +0000 UTC m=+0.248422200 container remove 9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_dewdney, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:40:05 compute-0 systemd[1]: libpod-conmon-9420cc8ecae1361acc3e437a963e91bbad8a539082d853538b87db9838b0025b.scope: Deactivated successfully.
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"} : dispatch
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:40:05 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:40:05 compute-0 podman[259668]: 2025-12-03 21:40:05.302821564 +0000 UTC m=+0.070495684 container create 988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_jemison, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:40:05 compute-0 systemd[1]: Started libpod-conmon-988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71.scope.
Dec 03 21:40:05 compute-0 podman[259668]: 2025-12-03 21:40:05.272319435 +0000 UTC m=+0.039993595 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:05 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8023176697cee9a01a0d55c6369ff0235ca846eb11a41dd424f0768faabcfeae/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8023176697cee9a01a0d55c6369ff0235ca846eb11a41dd424f0768faabcfeae/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8023176697cee9a01a0d55c6369ff0235ca846eb11a41dd424f0768faabcfeae/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8023176697cee9a01a0d55c6369ff0235ca846eb11a41dd424f0768faabcfeae/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8023176697cee9a01a0d55c6369ff0235ca846eb11a41dd424f0768faabcfeae/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:05 compute-0 podman[259668]: 2025-12-03 21:40:05.421603743 +0000 UTC m=+0.189277913 container init 988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_jemison, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:40:05 compute-0 podman[259668]: 2025-12-03 21:40:05.437592122 +0000 UTC m=+0.205266232 container start 988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_jemison, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Dec 03 21:40:05 compute-0 podman[259668]: 2025-12-03 21:40:05.441993961 +0000 UTC m=+0.209668101 container attach 988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_jemison, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True)
Dec 03 21:40:05 compute-0 podman[259682]: 2025-12-03 21:40:05.559843675 +0000 UTC m=+0.211548041 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:40:05 compute-0 peaceful_jemison[259685]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:40:05 compute-0 peaceful_jemison[259685]: --> All data devices are unavailable
Dec 03 21:40:06 compute-0 systemd[1]: libpod-988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71.scope: Deactivated successfully.
Dec 03 21:40:06 compute-0 podman[259730]: 2025-12-03 21:40:06.08819991 +0000 UTC m=+0.041135144 container died 988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_jemison, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 03 21:40:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-8023176697cee9a01a0d55c6369ff0235ca846eb11a41dd424f0768faabcfeae-merged.mount: Deactivated successfully.
Dec 03 21:40:06 compute-0 podman[259730]: 2025-12-03 21:40:06.149035874 +0000 UTC m=+0.101971058 container remove 988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=peaceful_jemison, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:40:06 compute-0 systemd[1]: libpod-conmon-988d30ac7afefb91ecd6979e93fb7058afb9a79378a33c1274c2de7c42730e71.scope: Deactivated successfully.
Dec 03 21:40:06 compute-0 sudo[259592]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:06 compute-0 sudo[259745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:40:06 compute-0 sudo[259745]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:06 compute-0 sudo[259745]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:06 compute-0 ceph-mon[75204]: pgmap v1052: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:06 compute-0 sudo[259770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:40:06 compute-0 sudo[259770]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1053: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:06 compute-0 podman[259809]: 2025-12-03 21:40:06.723395355 +0000 UTC m=+0.077214554 container create 6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jolly_dhawan, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:40:06 compute-0 systemd[1]: Started libpod-conmon-6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6.scope.
Dec 03 21:40:06 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:06 compute-0 podman[259809]: 2025-12-03 21:40:06.702906545 +0000 UTC m=+0.056725774 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:06 compute-0 podman[259809]: 2025-12-03 21:40:06.810866363 +0000 UTC m=+0.164685642 container init 6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jolly_dhawan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 03 21:40:06 compute-0 podman[259809]: 2025-12-03 21:40:06.822131516 +0000 UTC m=+0.175950745 container start 6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jolly_dhawan, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:40:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:06 compute-0 podman[259809]: 2025-12-03 21:40:06.82675037 +0000 UTC m=+0.180569589 container attach 6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jolly_dhawan, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:40:06 compute-0 jolly_dhawan[259825]: 167 167
Dec 03 21:40:06 compute-0 systemd[1]: libpod-6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6.scope: Deactivated successfully.
Dec 03 21:40:06 compute-0 podman[259809]: 2025-12-03 21:40:06.831323222 +0000 UTC m=+0.185142451 container died 6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jolly_dhawan, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:40:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-d2448b85bf2696f9563ea83ab1e15c47ead65a1261e8a57b00afdb4d3e53eff9-merged.mount: Deactivated successfully.
Dec 03 21:40:06 compute-0 podman[259809]: 2025-12-03 21:40:06.884889571 +0000 UTC m=+0.238708800 container remove 6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=jolly_dhawan, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:40:06 compute-0 systemd[1]: libpod-conmon-6e92615f2c58e04fe85d5692ca3037893cded05f01f43a4c9ef3962bc74575a6.scope: Deactivated successfully.
Dec 03 21:40:07 compute-0 podman[259849]: 2025-12-03 21:40:07.140843583 +0000 UTC m=+0.064685407 container create b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rubin, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:40:07 compute-0 systemd[1]: Started libpod-conmon-b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434.scope.
Dec 03 21:40:07 compute-0 podman[259849]: 2025-12-03 21:40:07.11427715 +0000 UTC m=+0.038119024 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:07 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19a5b32ee8a617225352ab051ed29d7244005b639a8290455e2013b32a175f0c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19a5b32ee8a617225352ab051ed29d7244005b639a8290455e2013b32a175f0c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19a5b32ee8a617225352ab051ed29d7244005b639a8290455e2013b32a175f0c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19a5b32ee8a617225352ab051ed29d7244005b639a8290455e2013b32a175f0c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:07 compute-0 podman[259849]: 2025-12-03 21:40:07.260393773 +0000 UTC m=+0.184235647 container init b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rubin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:40:07 compute-0 podman[259849]: 2025-12-03 21:40:07.277268156 +0000 UTC m=+0.201109980 container start b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rubin, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20251030, OSD_FLAVOR=default)
Dec 03 21:40:07 compute-0 podman[259849]: 2025-12-03 21:40:07.281114439 +0000 UTC m=+0.204956253 container attach b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rubin, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 03 21:40:07 compute-0 cool_rubin[259865]: {
Dec 03 21:40:07 compute-0 cool_rubin[259865]:     "0": [
Dec 03 21:40:07 compute-0 cool_rubin[259865]:         {
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "devices": [
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "/dev/loop3"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             ],
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_name": "ceph_lv0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_size": "21470642176",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "name": "ceph_lv0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "tags": {
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cluster_name": "ceph",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.crush_device_class": "",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.encrypted": "0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.objectstore": "bluestore",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osd_id": "0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.type": "block",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.vdo": "0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.with_tpm": "0"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             },
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "type": "block",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "vg_name": "ceph_vg0"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:         }
Dec 03 21:40:07 compute-0 cool_rubin[259865]:     ],
Dec 03 21:40:07 compute-0 cool_rubin[259865]:     "1": [
Dec 03 21:40:07 compute-0 cool_rubin[259865]:         {
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "devices": [
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "/dev/loop4"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             ],
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_name": "ceph_lv1",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_size": "21470642176",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "name": "ceph_lv1",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "tags": {
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cluster_name": "ceph",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.crush_device_class": "",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.encrypted": "0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.objectstore": "bluestore",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osd_id": "1",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.type": "block",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.vdo": "0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.with_tpm": "0"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             },
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "type": "block",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "vg_name": "ceph_vg1"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:         }
Dec 03 21:40:07 compute-0 cool_rubin[259865]:     ],
Dec 03 21:40:07 compute-0 cool_rubin[259865]:     "2": [
Dec 03 21:40:07 compute-0 cool_rubin[259865]:         {
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "devices": [
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "/dev/loop5"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             ],
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_name": "ceph_lv2",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_size": "21470642176",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "name": "ceph_lv2",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "tags": {
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.cluster_name": "ceph",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.crush_device_class": "",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.encrypted": "0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.objectstore": "bluestore",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osd_id": "2",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.type": "block",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.vdo": "0",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:                 "ceph.with_tpm": "0"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             },
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "type": "block",
Dec 03 21:40:07 compute-0 cool_rubin[259865]:             "vg_name": "ceph_vg2"
Dec 03 21:40:07 compute-0 cool_rubin[259865]:         }
Dec 03 21:40:07 compute-0 cool_rubin[259865]:     ]
Dec 03 21:40:07 compute-0 cool_rubin[259865]: }
Dec 03 21:40:07 compute-0 systemd[1]: libpod-b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434.scope: Deactivated successfully.
Dec 03 21:40:07 compute-0 podman[259849]: 2025-12-03 21:40:07.655642535 +0000 UTC m=+0.579484389 container died b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rubin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:40:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-19a5b32ee8a617225352ab051ed29d7244005b639a8290455e2013b32a175f0c-merged.mount: Deactivated successfully.
Dec 03 21:40:07 compute-0 podman[259849]: 2025-12-03 21:40:07.70799274 +0000 UTC m=+0.631834564 container remove b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rubin, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030)
Dec 03 21:40:07 compute-0 systemd[1]: libpod-conmon-b013c2f44cca706b7d5c86b2d6a4bcb2f32162f90fa3025c249a4fd47d198434.scope: Deactivated successfully.
Dec 03 21:40:07 compute-0 sudo[259770]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:07 compute-0 sudo[259884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:40:07 compute-0 sudo[259884]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:07 compute-0 sudo[259884]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:07 compute-0 sudo[259909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:40:07 compute-0 sudo[259909]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:08 compute-0 podman[259946]: 2025-12-03 21:40:08.26292065 +0000 UTC m=+0.066241489 container create 22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_kapitsa, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:40:08 compute-0 systemd[1]: Started libpod-conmon-22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897.scope.
Dec 03 21:40:08 compute-0 ceph-mon[75204]: pgmap v1053: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:08 compute-0 podman[259946]: 2025-12-03 21:40:08.237393234 +0000 UTC m=+0.040714133 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:08 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:08 compute-0 podman[259946]: 2025-12-03 21:40:08.360810948 +0000 UTC m=+0.164131837 container init 22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_kapitsa, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:40:08 compute-0 podman[259946]: 2025-12-03 21:40:08.371874395 +0000 UTC m=+0.175195244 container start 22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_kapitsa, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:40:08 compute-0 podman[259946]: 2025-12-03 21:40:08.377318701 +0000 UTC m=+0.180639550 container attach 22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_kapitsa, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:40:08 compute-0 optimistic_kapitsa[259962]: 167 167
Dec 03 21:40:08 compute-0 systemd[1]: libpod-22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897.scope: Deactivated successfully.
Dec 03 21:40:08 compute-0 podman[259946]: 2025-12-03 21:40:08.380026374 +0000 UTC m=+0.183347213 container died 22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_kapitsa, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Dec 03 21:40:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-a751f788a6dfacf3c1cf03b2be8e567a344abc9f119f4f0f636a11b875dd5881-merged.mount: Deactivated successfully.
Dec 03 21:40:08 compute-0 podman[259946]: 2025-12-03 21:40:08.431425654 +0000 UTC m=+0.234746503 container remove 22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=optimistic_kapitsa, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:40:08 compute-0 systemd[1]: libpod-conmon-22e4b35787a535b674abcaf5f67cc255f14d94ce6ed4bb9feea2579740278897.scope: Deactivated successfully.
Dec 03 21:40:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1054: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:08 compute-0 podman[259986]: 2025-12-03 21:40:08.674763408 +0000 UTC m=+0.074474311 container create aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_haslett, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:40:08 compute-0 systemd[1]: Started libpod-conmon-aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316.scope.
Dec 03 21:40:08 compute-0 podman[259986]: 2025-12-03 21:40:08.642908202 +0000 UTC m=+0.042619115 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:40:08 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:40:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a17fd44a83bae8b84c089e3b78844d4b599816cb66ab8c64f60643483c710d6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a17fd44a83bae8b84c089e3b78844d4b599816cb66ab8c64f60643483c710d6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a17fd44a83bae8b84c089e3b78844d4b599816cb66ab8c64f60643483c710d6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a17fd44a83bae8b84c089e3b78844d4b599816cb66ab8c64f60643483c710d6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:40:08 compute-0 podman[259986]: 2025-12-03 21:40:08.777692581 +0000 UTC m=+0.177403454 container init aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_haslett, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:40:08 compute-0 podman[259986]: 2025-12-03 21:40:08.792166049 +0000 UTC m=+0.191876952 container start aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_haslett, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:40:08 compute-0 podman[259986]: 2025-12-03 21:40:08.796442164 +0000 UTC m=+0.196153047 container attach aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_haslett, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:40:09 compute-0 lvm[260083]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:40:09 compute-0 lvm[260084]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:40:09 compute-0 lvm[260080]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:40:09 compute-0 lvm[260080]: VG ceph_vg0 finished
Dec 03 21:40:09 compute-0 lvm[260083]: VG ceph_vg1 finished
Dec 03 21:40:09 compute-0 lvm[260084]: VG ceph_vg2 finished
Dec 03 21:40:09 compute-0 pedantic_haslett[260003]: {}
Dec 03 21:40:09 compute-0 systemd[1]: libpod-aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316.scope: Deactivated successfully.
Dec 03 21:40:09 compute-0 systemd[1]: libpod-aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316.scope: Consumed 1.496s CPU time.
Dec 03 21:40:09 compute-0 podman[260087]: 2025-12-03 21:40:09.729326442 +0000 UTC m=+0.032349010 container died aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_haslett, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:40:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-7a17fd44a83bae8b84c089e3b78844d4b599816cb66ab8c64f60643483c710d6-merged.mount: Deactivated successfully.
Dec 03 21:40:09 compute-0 podman[260087]: 2025-12-03 21:40:09.777275129 +0000 UTC m=+0.080297657 container remove aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=pedantic_haslett, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 03 21:40:09 compute-0 systemd[1]: libpod-conmon-aa42f17ec442a7a2017b9347bd00c28aedfbac78c2661eb1dce7f47d6d403316.scope: Deactivated successfully.
Dec 03 21:40:09 compute-0 sudo[259909]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:40:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:09 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:40:09 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:09 compute-0 sudo[260102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:40:09 compute-0 sudo[260102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:40:09 compute-0 sudo[260102]: pam_unix(sudo:session): session closed for user root
Dec 03 21:40:10 compute-0 ceph-mon[75204]: pgmap v1054: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:10 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:40:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1055: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1056: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:12 compute-0 ceph-mon[75204]: pgmap v1055: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:13 compute-0 ceph-mon[75204]: pgmap v1056: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1057: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:15 compute-0 ceph-mon[75204]: pgmap v1057: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1058: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:17 compute-0 ceph-mon[75204]: pgmap v1058: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1059: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:19 compute-0 ceph-mon[75204]: pgmap v1059: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:20 compute-0 podman[260128]: 2025-12-03 21:40:20.172694608 +0000 UTC m=+0.098298200 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 03 21:40:20 compute-0 podman[260127]: 2025-12-03 21:40:20.184986758 +0000 UTC m=+0.110573439 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 03 21:40:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1060: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:40:21
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['vms', 'volumes', 'backups', 'cephfs.cephfs.meta', 'images', '.mgr', 'cephfs.cephfs.data']
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:40:21 compute-0 ceph-mon[75204]: pgmap v1060: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:40:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:40:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:40:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1061: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:23 compute-0 ceph-mon[75204]: pgmap v1061: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1062: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:24 compute-0 nova_compute[241566]: 2025-12-03 21:40:24.565 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:25 compute-0 nova_compute[241566]: 2025-12-03 21:40:25.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:25 compute-0 ceph-mon[75204]: pgmap v1062: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1063: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:26 compute-0 nova_compute[241566]: 2025-12-03 21:40:26.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:26 compute-0 nova_compute[241566]: 2025-12-03 21:40:26.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:40:26 compute-0 nova_compute[241566]: 2025-12-03 21:40:26.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:40:26 compute-0 nova_compute[241566]: 2025-12-03 21:40:26.572 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:40:26 compute-0 nova_compute[241566]: 2025-12-03 21:40:26.572 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:27 compute-0 ceph-mon[75204]: pgmap v1063: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:40:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:40:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1064: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:28 compute-0 nova_compute[241566]: 2025-12-03 21:40:28.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:28 compute-0 nova_compute[241566]: 2025-12-03 21:40:28.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:29 compute-0 ceph-mon[75204]: pgmap v1064: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1065: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.553 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.584 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.584 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.585 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.585 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:40:30 compute-0 nova_compute[241566]: 2025-12-03 21:40:30.586 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:40:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:40:31 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4079128885' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.122 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.343 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.344 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5114MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.344 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.344 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.421 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.421 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:40:31 compute-0 nova_compute[241566]: 2025-12-03 21:40:31.451 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:40:31 compute-0 ceph-mon[75204]: pgmap v1065: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:31 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4079128885' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:40:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:40:32 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2670689828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:40:32 compute-0 nova_compute[241566]: 2025-12-03 21:40:32.024 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.572s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:40:32 compute-0 nova_compute[241566]: 2025-12-03 21:40:32.030 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:40:32 compute-0 nova_compute[241566]: 2025-12-03 21:40:32.054 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:40:32 compute-0 nova_compute[241566]: 2025-12-03 21:40:32.057 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:40:32 compute-0 nova_compute[241566]: 2025-12-03 21:40:32.058 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.714s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:40:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1066: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:32 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2670689828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:40:33 compute-0 ceph-mon[75204]: pgmap v1066: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1067: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:35 compute-0 ceph-mon[75204]: pgmap v1067: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:36 compute-0 podman[260210]: 2025-12-03 21:40:36.207342865 +0000 UTC m=+0.144040229 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 03 21:40:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1068: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:37 compute-0 ceph-mon[75204]: pgmap v1068: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1069: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:39 compute-0 ceph-mon[75204]: pgmap v1069: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1070: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:41 compute-0 ceph-mon[75204]: pgmap v1070: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1071: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:43 compute-0 ceph-mon[75204]: pgmap v1071: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1072: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:45 compute-0 ceph-mon[75204]: pgmap v1072: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1073: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:47 compute-0 ceph-mon[75204]: pgmap v1073: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1074: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:40:48.945 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:40:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:40:48.946 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:40:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:40:48.946 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:40:49 compute-0 ceph-mon[75204]: pgmap v1074: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1075: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:51 compute-0 podman[260237]: 2025-12-03 21:40:51.159918178 +0000 UTC m=+0.084946131 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:40:51 compute-0 podman[260236]: 2025-12-03 21:40:51.163145705 +0000 UTC m=+0.091330253 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:40:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:40:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:40:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:40:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:40:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:40:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:40:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:52 compute-0 ceph-mon[75204]: pgmap v1075: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1076: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:54 compute-0 ceph-mon[75204]: pgmap v1076: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1077: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:56 compute-0 ceph-mon[75204]: pgmap v1077: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1078: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:40:58 compute-0 ceph-mon[75204]: pgmap v1078: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1079: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:40:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:40:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/222267683' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:40:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:40:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/222267683' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:41:00 compute-0 ceph-mon[75204]: pgmap v1079: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/222267683' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:41:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/222267683' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:41:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1080: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:02 compute-0 ceph-mon[75204]: pgmap v1080: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1081: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:04 compute-0 ceph-mon[75204]: pgmap v1081: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1082: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:06 compute-0 ceph-mon[75204]: pgmap v1082: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1083: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:07 compute-0 podman[260276]: 2025-12-03 21:41:07.185507285 +0000 UTC m=+0.121284167 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 03 21:41:08 compute-0 ceph-mon[75204]: pgmap v1083: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1084: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:10 compute-0 sudo[260302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:41:10 compute-0 sudo[260302]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:10 compute-0 sudo[260302]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:10 compute-0 sudo[260327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:41:10 compute-0 sudo[260327]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:10 compute-0 ceph-mon[75204]: pgmap v1084: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1085: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:10 compute-0 sudo[260327]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:41:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:41:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:41:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:41:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:41:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:41:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:41:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:41:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:41:10 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:41:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:41:10 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:41:10 compute-0 sudo[260383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:41:11 compute-0 sudo[260383]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:11 compute-0 sudo[260383]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:11 compute-0 sudo[260408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:41:11 compute-0 sudo[260408]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:41:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:41:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:41:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:41:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:41:11 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:41:11 compute-0 podman[260445]: 2025-12-03 21:41:11.40224756 +0000 UTC m=+0.057817482 container create 9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_goldstine, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:41:11 compute-0 systemd[1]: Started libpod-conmon-9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f.scope.
Dec 03 21:41:11 compute-0 podman[260445]: 2025-12-03 21:41:11.374908396 +0000 UTC m=+0.030478388 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:41:11 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:41:11 compute-0 podman[260445]: 2025-12-03 21:41:11.505131373 +0000 UTC m=+0.160701335 container init 9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_goldstine, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030)
Dec 03 21:41:11 compute-0 podman[260445]: 2025-12-03 21:41:11.517840404 +0000 UTC m=+0.173410326 container start 9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_goldstine, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:41:11 compute-0 podman[260445]: 2025-12-03 21:41:11.521672897 +0000 UTC m=+0.177242829 container attach 9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_goldstine, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:41:11 compute-0 naughty_goldstine[260461]: 167 167
Dec 03 21:41:11 compute-0 systemd[1]: libpod-9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f.scope: Deactivated successfully.
Dec 03 21:41:11 compute-0 podman[260445]: 2025-12-03 21:41:11.529042624 +0000 UTC m=+0.184612556 container died 9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_goldstine, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:41:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-6678d0ab2994b4fd6426d3be979da4602aad0471c3ab23489cda282537a75fdb-merged.mount: Deactivated successfully.
Dec 03 21:41:11 compute-0 podman[260445]: 2025-12-03 21:41:11.586067326 +0000 UTC m=+0.241637258 container remove 9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=naughty_goldstine, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 03 21:41:11 compute-0 systemd[1]: libpod-conmon-9c668b31aa3ca8d5d216ef6e9ea4faac47eb318b7112416b8262ae4f791a930f.scope: Deactivated successfully.
Dec 03 21:41:11 compute-0 podman[260484]: 2025-12-03 21:41:11.806796503 +0000 UTC m=+0.059758676 container create ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_diffie, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:41:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:11 compute-0 systemd[1]: Started libpod-conmon-ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4.scope.
Dec 03 21:41:11 compute-0 podman[260484]: 2025-12-03 21:41:11.784638138 +0000 UTC m=+0.037600301 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:41:11 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:41:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d14f91001be773318ca5b8083375a223d19971159ee726e68cad2de8a309e8fa/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d14f91001be773318ca5b8083375a223d19971159ee726e68cad2de8a309e8fa/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d14f91001be773318ca5b8083375a223d19971159ee726e68cad2de8a309e8fa/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d14f91001be773318ca5b8083375a223d19971159ee726e68cad2de8a309e8fa/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d14f91001be773318ca5b8083375a223d19971159ee726e68cad2de8a309e8fa/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:11 compute-0 podman[260484]: 2025-12-03 21:41:11.926396673 +0000 UTC m=+0.179358856 container init ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_diffie, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle)
Dec 03 21:41:11 compute-0 podman[260484]: 2025-12-03 21:41:11.940159903 +0000 UTC m=+0.193122086 container start ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_diffie, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:41:11 compute-0 podman[260484]: 2025-12-03 21:41:11.94488533 +0000 UTC m=+0.197847503 container attach ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_diffie, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:41:12 compute-0 ceph-mon[75204]: pgmap v1085: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:12 compute-0 nervous_diffie[260502]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:41:12 compute-0 nervous_diffie[260502]: --> All data devices are unavailable
Dec 03 21:41:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1086: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:12 compute-0 systemd[1]: libpod-ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4.scope: Deactivated successfully.
Dec 03 21:41:12 compute-0 podman[260484]: 2025-12-03 21:41:12.571421772 +0000 UTC m=+0.824383955 container died ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_diffie, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:41:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-d14f91001be773318ca5b8083375a223d19971159ee726e68cad2de8a309e8fa-merged.mount: Deactivated successfully.
Dec 03 21:41:12 compute-0 podman[260484]: 2025-12-03 21:41:12.628798302 +0000 UTC m=+0.881760485 container remove ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nervous_diffie, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 03 21:41:12 compute-0 systemd[1]: libpod-conmon-ac2d32ffef965c3d707319e5066597ae1634dd4933eb69e657e29624a5e23ff4.scope: Deactivated successfully.
Dec 03 21:41:12 compute-0 sudo[260408]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:12 compute-0 sudo[260533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:41:12 compute-0 sudo[260533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:12 compute-0 sudo[260533]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:12 compute-0 sudo[260558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:41:12 compute-0 sudo[260558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:13 compute-0 podman[260594]: 2025-12-03 21:41:13.174805652 +0000 UTC m=+0.064118522 container create f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_euclid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030)
Dec 03 21:41:13 compute-0 systemd[1]: Started libpod-conmon-f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f.scope.
Dec 03 21:41:13 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:41:13 compute-0 podman[260594]: 2025-12-03 21:41:13.156417968 +0000 UTC m=+0.045730838 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:41:13 compute-0 podman[260594]: 2025-12-03 21:41:13.267290085 +0000 UTC m=+0.156602965 container init f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_euclid, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:41:13 compute-0 podman[260594]: 2025-12-03 21:41:13.278832855 +0000 UTC m=+0.168145735 container start f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_euclid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=tentacle)
Dec 03 21:41:13 compute-0 podman[260594]: 2025-12-03 21:41:13.282685128 +0000 UTC m=+0.171998078 container attach f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_euclid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:41:13 compute-0 kind_euclid[260610]: 167 167
Dec 03 21:41:13 compute-0 podman[260594]: 2025-12-03 21:41:13.286204113 +0000 UTC m=+0.175516993 container died f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_euclid, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 21:41:13 compute-0 systemd[1]: libpod-f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f.scope: Deactivated successfully.
Dec 03 21:41:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-3f4ee4a47f09b3c5a564807046da8cfb1dc2c100c727072d6f0446a3a56f5793-merged.mount: Deactivated successfully.
Dec 03 21:41:13 compute-0 podman[260594]: 2025-12-03 21:41:13.328310904 +0000 UTC m=+0.217623804 container remove f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=kind_euclid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 03 21:41:13 compute-0 systemd[1]: libpod-conmon-f7028cc1ef14a9bdf3977e6facfae2a1a54a92cdf3e72058776ec7d1a859485f.scope: Deactivated successfully.
Dec 03 21:41:13 compute-0 podman[260636]: 2025-12-03 21:41:13.529466204 +0000 UTC m=+0.041114564 container create 0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_fermat, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:41:13 compute-0 systemd[1]: Started libpod-conmon-0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1.scope.
Dec 03 21:41:13 compute-0 podman[260636]: 2025-12-03 21:41:13.511646206 +0000 UTC m=+0.023294556 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:41:13 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:41:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c577a8a24eb18401172afd33f03f1e1545a83997215e5e74c176c97fda66159/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c577a8a24eb18401172afd33f03f1e1545a83997215e5e74c176c97fda66159/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c577a8a24eb18401172afd33f03f1e1545a83997215e5e74c176c97fda66159/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c577a8a24eb18401172afd33f03f1e1545a83997215e5e74c176c97fda66159/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:13 compute-0 podman[260636]: 2025-12-03 21:41:13.650065463 +0000 UTC m=+0.161713813 container init 0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_fermat, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, ceph=True)
Dec 03 21:41:13 compute-0 podman[260636]: 2025-12-03 21:41:13.663536714 +0000 UTC m=+0.175185044 container start 0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_fermat, CEPH_REF=tentacle, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:41:13 compute-0 podman[260636]: 2025-12-03 21:41:13.666530664 +0000 UTC m=+0.178179034 container attach 0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_fermat, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]: {
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:     "0": [
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:         {
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "devices": [
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "/dev/loop3"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             ],
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_name": "ceph_lv0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_size": "21470642176",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "name": "ceph_lv0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "tags": {
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cluster_name": "ceph",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.crush_device_class": "",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.encrypted": "0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.objectstore": "bluestore",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osd_id": "0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.type": "block",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.vdo": "0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.with_tpm": "0"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             },
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "type": "block",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "vg_name": "ceph_vg0"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:         }
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:     ],
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:     "1": [
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:         {
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "devices": [
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "/dev/loop4"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             ],
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_name": "ceph_lv1",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_size": "21470642176",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "name": "ceph_lv1",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "tags": {
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cluster_name": "ceph",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.crush_device_class": "",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.encrypted": "0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.objectstore": "bluestore",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osd_id": "1",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.type": "block",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.vdo": "0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.with_tpm": "0"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             },
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "type": "block",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "vg_name": "ceph_vg1"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:         }
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:     ],
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:     "2": [
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:         {
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "devices": [
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "/dev/loop5"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             ],
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_name": "ceph_lv2",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_size": "21470642176",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "name": "ceph_lv2",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "tags": {
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.cluster_name": "ceph",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.crush_device_class": "",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.encrypted": "0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.objectstore": "bluestore",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osd_id": "2",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.type": "block",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.vdo": "0",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:                 "ceph.with_tpm": "0"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             },
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "type": "block",
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:             "vg_name": "ceph_vg2"
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:         }
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]:     ]
Dec 03 21:41:13 compute-0 stupefied_fermat[260652]: }
Dec 03 21:41:13 compute-0 systemd[1]: libpod-0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1.scope: Deactivated successfully.
Dec 03 21:41:14 compute-0 podman[260636]: 2025-12-03 21:41:14.000652256 +0000 UTC m=+0.512300656 container died 0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_fermat, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030)
Dec 03 21:41:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-0c577a8a24eb18401172afd33f03f1e1545a83997215e5e74c176c97fda66159-merged.mount: Deactivated successfully.
Dec 03 21:41:14 compute-0 podman[260636]: 2025-12-03 21:41:14.05146333 +0000 UTC m=+0.563111660 container remove 0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=stupefied_fermat, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:41:14 compute-0 systemd[1]: libpod-conmon-0b55453663435efd9fc94e1626184dd3359b304f7154f55eb28ff0e37ad4aac1.scope: Deactivated successfully.
Dec 03 21:41:14 compute-0 sudo[260558]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:14 compute-0 sudo[260672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:41:14 compute-0 sudo[260672]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:14 compute-0 sudo[260672]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:14 compute-0 ceph-mon[75204]: pgmap v1086: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:14 compute-0 sudo[260697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:41:14 compute-0 sudo[260697]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1087: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:14 compute-0 podman[260734]: 2025-12-03 21:41:14.705162811 +0000 UTC m=+0.067566085 container create 11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_beaver, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True)
Dec 03 21:41:14 compute-0 systemd[1]: Started libpod-conmon-11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27.scope.
Dec 03 21:41:14 compute-0 podman[260734]: 2025-12-03 21:41:14.676549683 +0000 UTC m=+0.038953007 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:41:14 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:41:14 compute-0 podman[260734]: 2025-12-03 21:41:14.799551145 +0000 UTC m=+0.161954479 container init 11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_beaver, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True)
Dec 03 21:41:14 compute-0 podman[260734]: 2025-12-03 21:41:14.806654345 +0000 UTC m=+0.169057599 container start 11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_beaver, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:41:14 compute-0 podman[260734]: 2025-12-03 21:41:14.809809731 +0000 UTC m=+0.172222115 container attach 11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_beaver, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:41:14 compute-0 angry_beaver[260752]: 167 167
Dec 03 21:41:14 compute-0 systemd[1]: libpod-11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27.scope: Deactivated successfully.
Dec 03 21:41:14 compute-0 podman[260734]: 2025-12-03 21:41:14.814007303 +0000 UTC m=+0.176410587 container died 11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_beaver, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:41:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-2b1b3550cb6978738dd58861858b033f909702fabd0c508b241a80658426b615-merged.mount: Deactivated successfully.
Dec 03 21:41:14 compute-0 podman[260734]: 2025-12-03 21:41:14.868435114 +0000 UTC m=+0.230838368 container remove 11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_beaver, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030)
Dec 03 21:41:14 compute-0 systemd[1]: libpod-conmon-11f0139ece34d6c1a710545ffddcb249b216d95583f6a9938ad20422f9b0ef27.scope: Deactivated successfully.
Dec 03 21:41:15 compute-0 podman[260775]: 2025-12-03 21:41:15.136005469 +0000 UTC m=+0.069347884 container create 867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_mclean, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:41:15 compute-0 systemd[1]: Started libpod-conmon-867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f.scope.
Dec 03 21:41:15 compute-0 podman[260775]: 2025-12-03 21:41:15.107345449 +0000 UTC m=+0.040687914 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:41:15 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:41:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e3732a8b3ba9becb6e7b6ef80068bfb9268f343510c2b11d3336355abf44142/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e3732a8b3ba9becb6e7b6ef80068bfb9268f343510c2b11d3336355abf44142/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e3732a8b3ba9becb6e7b6ef80068bfb9268f343510c2b11d3336355abf44142/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e3732a8b3ba9becb6e7b6ef80068bfb9268f343510c2b11d3336355abf44142/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:41:15 compute-0 podman[260775]: 2025-12-03 21:41:15.259052842 +0000 UTC m=+0.192395267 container init 867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_mclean, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:41:15 compute-0 podman[260775]: 2025-12-03 21:41:15.271952818 +0000 UTC m=+0.205295223 container start 867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_mclean, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:41:15 compute-0 podman[260775]: 2025-12-03 21:41:15.276539772 +0000 UTC m=+0.209882237 container attach 867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_mclean, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:41:16 compute-0 lvm[260868]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:41:16 compute-0 lvm[260872]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:41:16 compute-0 lvm[260868]: VG ceph_vg0 finished
Dec 03 21:41:16 compute-0 lvm[260872]: VG ceph_vg2 finished
Dec 03 21:41:16 compute-0 lvm[260871]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:41:16 compute-0 lvm[260871]: VG ceph_vg1 finished
Dec 03 21:41:16 compute-0 angry_mclean[260791]: {}
Dec 03 21:41:16 compute-0 systemd[1]: libpod-867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f.scope: Deactivated successfully.
Dec 03 21:41:16 compute-0 podman[260775]: 2025-12-03 21:41:16.178011946 +0000 UTC m=+1.111354351 container died 867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_mclean, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True)
Dec 03 21:41:16 compute-0 systemd[1]: libpod-867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f.scope: Consumed 1.499s CPU time.
Dec 03 21:41:16 compute-0 ceph-mon[75204]: pgmap v1087: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-2e3732a8b3ba9becb6e7b6ef80068bfb9268f343510c2b11d3336355abf44142-merged.mount: Deactivated successfully.
Dec 03 21:41:16 compute-0 podman[260775]: 2025-12-03 21:41:16.51338213 +0000 UTC m=+1.446724535 container remove 867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=angry_mclean, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:41:16 compute-0 systemd[1]: libpod-conmon-867e880f3cc2c8f5e1e597e9859eeb5e8dfa80f75b62a98ef9135e004c80f23f.scope: Deactivated successfully.
Dec 03 21:41:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1088: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:16 compute-0 sudo[260697]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:41:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:41:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:41:16 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:41:16 compute-0 sudo[260890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:41:16 compute-0 sudo[260890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:41:16 compute-0 sudo[260890]: pam_unix(sudo:session): session closed for user root
Dec 03 21:41:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:17 compute-0 ceph-mon[75204]: pgmap v1088: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:41:17 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:41:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1089: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:19 compute-0 ceph-mon[75204]: pgmap v1089: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1090: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:41:21
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.meta', '.mgr', 'backups', 'volumes', 'images', 'cephfs.cephfs.data', 'vms']
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:41:21 compute-0 ceph-mon[75204]: pgmap v1090: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:41:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:41:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:41:22 compute-0 podman[260915]: 2025-12-03 21:41:22.191279636 +0000 UTC m=+0.105799941 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 03 21:41:22 compute-0 podman[260916]: 2025-12-03 21:41:22.20591732 +0000 UTC m=+0.120754623 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 03 21:41:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1091: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:23 compute-0 ceph-mon[75204]: pgmap v1091: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1092: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:25 compute-0 ceph-mon[75204]: pgmap v1092: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:26 compute-0 nova_compute[241566]: 2025-12-03 21:41:26.053 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:26 compute-0 nova_compute[241566]: 2025-12-03 21:41:26.054 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1093: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:26 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:27 compute-0 nova_compute[241566]: 2025-12-03 21:41:27.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:27 compute-0 nova_compute[241566]: 2025-12-03 21:41:27.551 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:41:27 compute-0 nova_compute[241566]: 2025-12-03 21:41:27.551 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:41:27 compute-0 nova_compute[241566]: 2025-12-03 21:41:27.566 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:41:27 compute-0 nova_compute[241566]: 2025-12-03 21:41:27.567 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:27 compute-0 ceph-mon[75204]: pgmap v1093: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:41:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:41:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1094: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:28 compute-0 nova_compute[241566]: 2025-12-03 21:41:28.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:28 compute-0 nova_compute[241566]: 2025-12-03 21:41:28.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:29 compute-0 ceph-mon[75204]: pgmap v1094: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1095: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:30 compute-0 nova_compute[241566]: 2025-12-03 21:41:30.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.551 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.585 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.586 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.586 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.586 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:41:31 compute-0 nova_compute[241566]: 2025-12-03 21:41:31.587 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:41:31 compute-0 ceph-mon[75204]: pgmap v1095: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:41:32 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2014570269' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.176 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.589s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.372 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.374 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5107MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.374 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.375 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.465 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.466 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:41:32 compute-0 nova_compute[241566]: 2025-12-03 21:41:32.493 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:41:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1096: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:32 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2014570269' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:41:33 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:41:33 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2601065406' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:41:33 compute-0 nova_compute[241566]: 2025-12-03 21:41:33.059 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:41:33 compute-0 nova_compute[241566]: 2025-12-03 21:41:33.066 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:41:33 compute-0 nova_compute[241566]: 2025-12-03 21:41:33.091 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:41:33 compute-0 nova_compute[241566]: 2025-12-03 21:41:33.094 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:41:33 compute-0 nova_compute[241566]: 2025-12-03 21:41:33.094 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.720s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:41:33 compute-0 ceph-mon[75204]: pgmap v1096: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:33 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2601065406' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:41:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1097: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:35 compute-0 ceph-mon[75204]: pgmap v1097: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1098: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:37 compute-0 nova_compute[241566]: 2025-12-03 21:41:37.089 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:41:37 compute-0 ceph-mon[75204]: pgmap v1098: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:38 compute-0 podman[260997]: 2025-12-03 21:41:38.203776708 +0000 UTC m=+0.136524336 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 03 21:41:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1099: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:39 compute-0 ceph-mon[75204]: pgmap v1099: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1100: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:41 compute-0 ceph-mon[75204]: pgmap v1100: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1101: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:43 compute-0 ceph-mon[75204]: pgmap v1101: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1102: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:46 compute-0 ceph-mon[75204]: pgmap v1102: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1103: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:48 compute-0 ceph-mon[75204]: pgmap v1103: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1104: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:41:48.949 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:41:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:41:48.949 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:41:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:41:48.950 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:41:49 compute-0 ceph-mon[75204]: pgmap v1104: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #51. Immutable memtables: 0.
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.388358) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 51
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798110388399, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2049, "num_deletes": 251, "total_data_size": 2398933, "memory_usage": 2443008, "flush_reason": "Manual Compaction"}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #52: started
Dec 03 21:41:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1105: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798110575840, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 52, "file_size": 2315437, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20619, "largest_seqno": 22667, "table_properties": {"data_size": 2306200, "index_size": 5795, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18454, "raw_average_key_size": 19, "raw_value_size": 2287738, "raw_average_value_size": 2470, "num_data_blocks": 266, "num_entries": 926, "num_filter_entries": 926, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764797881, "oldest_key_time": 1764797881, "file_creation_time": 1764798110, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 52, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 188143 microseconds, and 10166 cpu microseconds.
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.576495) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #52: 2315437 bytes OK
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.576523) [db/memtable_list.cc:519] [default] Level-0 commit table #52 started
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.718081) [db/memtable_list.cc:722] [default] Level-0 commit table #52: memtable #1 done
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.718131) EVENT_LOG_v1 {"time_micros": 1764798110718121, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.718160) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2390363, prev total WAL file size 2390363, number of live WAL files 2.
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000048.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.719867) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [52(2261KB)], [50(5707KB)]
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798110719933, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [52], "files_L6": [50], "score": -1, "input_data_size": 8159606, "oldest_snapshot_seqno": -1}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #53: 4450 keys, 6932108 bytes, temperature: kUnknown
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798110844688, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 53, "file_size": 6932108, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 6898918, "index_size": 20984, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11141, "raw_key_size": 106555, "raw_average_key_size": 23, "raw_value_size": 6815629, "raw_average_value_size": 1531, "num_data_blocks": 893, "num_entries": 4450, "num_filter_entries": 4450, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764798110, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.844988) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 6932108 bytes
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.904089) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 65.4 rd, 55.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 5.6 +0.0 blob) out(6.6 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 4964, records dropped: 514 output_compression: NoCompression
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.904123) EVENT_LOG_v1 {"time_micros": 1764798110904109, "job": 26, "event": "compaction_finished", "compaction_time_micros": 124840, "compaction_time_cpu_micros": 31827, "output_level": 6, "num_output_files": 1, "total_output_size": 6932108, "num_input_records": 4964, "num_output_records": 4450, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000052.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798110905031, "job": 26, "event": "table_file_deletion", "file_number": 52}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798110907071, "job": 26, "event": "table_file_deletion", "file_number": 50}
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.719755) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.907149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.907157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.907160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.907163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:41:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:41:50.907166) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:41:51 compute-0 ceph-mon[75204]: pgmap v1105: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:41:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:41:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:41:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:41:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:41:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:41:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1106: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:53 compute-0 podman[261025]: 2025-12-03 21:41:53.163167335 +0000 UTC m=+0.087064778 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 03 21:41:53 compute-0 podman[261024]: 2025-12-03 21:41:53.170922734 +0000 UTC m=+0.100468879 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 03 21:41:53 compute-0 ceph-mon[75204]: pgmap v1106: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1107: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:56 compute-0 ceph-mon[75204]: pgmap v1107: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1108: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:41:58 compute-0 ceph-mon[75204]: pgmap v1108: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1109: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:59 compute-0 ceph-mon[75204]: pgmap v1109: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:41:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:41:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1028768076' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:41:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:41:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1028768076' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:42:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1110: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1028768076' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:42:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/1028768076' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:42:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:02 compute-0 ceph-mon[75204]: pgmap v1110: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1111: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:04 compute-0 ceph-mon[75204]: pgmap v1111: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1112: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:05 compute-0 ceph-mon[75204]: pgmap v1112: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1113: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:07 compute-0 ceph-mon[75204]: pgmap v1113: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1114: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:09 compute-0 podman[261064]: 2025-12-03 21:42:09.246278646 +0000 UTC m=+0.177620310 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 03 21:42:10 compute-0 ceph-mon[75204]: pgmap v1114: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1115: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:11 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:12 compute-0 ceph-mon[75204]: pgmap v1115: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1116: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:13 compute-0 nova_compute[241566]: 2025-12-03 21:42:13.739 241570 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.71 sec
Dec 03 21:42:13 compute-0 ceph-mon[75204]: pgmap v1116: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1117: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:15 compute-0 ceph-mon[75204]: pgmap v1117: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1118: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:16 compute-0 sudo[261090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:42:16 compute-0 sudo[261090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:16 compute-0 sudo[261090]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:16 compute-0 sudo[261115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ls
Dec 03 21:42:16 compute-0 sudo[261115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:16 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:17 compute-0 podman[261187]: 2025-12-03 21:42:17.459505453 +0000 UTC m=+0.101716542 container exec 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default)
Dec 03 21:42:17 compute-0 podman[261187]: 2025-12-03 21:42:17.575439476 +0000 UTC m=+0.217650555 container exec_died 5be1cf87f4450717250955c63021ef41dba8c1099adb634f3e1f6a0b130fa916 (image=quay.io/ceph/ceph:v20, name=ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mon-compute-0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3)
Dec 03 21:42:17 compute-0 ceph-mon[75204]: pgmap v1118: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:18 compute-0 sudo[261115]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:42:18 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:18 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:42:18 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:18 compute-0 nova_compute[241566]: 2025-12-03 21:42:18.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:42:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1119: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:18 compute-0 sudo[261357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:42:18 compute-0 sudo[261357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:18 compute-0 sudo[261357]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:18 compute-0 sudo[261382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:42:18 compute-0 sudo[261382]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:19 compute-0 sudo[261382]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:42:19 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:42:19 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:42:19 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:42:19 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:42:19 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:42:19 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:42:19 compute-0 sudo[261438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:42:19 compute-0 sudo[261438]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:19 compute-0 sudo[261438]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:19 compute-0 ceph-mon[75204]: pgmap v1119: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:42:19 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:42:19 compute-0 sudo[261463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:42:19 compute-0 sudo[261463]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:19 compute-0 podman[261500]: 2025-12-03 21:42:19.90041503 +0000 UTC m=+0.057376042 container create b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_hamilton, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:42:19 compute-0 systemd[1]: Started libpod-conmon-b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12.scope.
Dec 03 21:42:19 compute-0 podman[261500]: 2025-12-03 21:42:19.880863075 +0000 UTC m=+0.037824067 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:42:19 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:42:19 compute-0 podman[261500]: 2025-12-03 21:42:19.998114763 +0000 UTC m=+0.155075835 container init b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_hamilton, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:42:20 compute-0 podman[261500]: 2025-12-03 21:42:20.010117785 +0000 UTC m=+0.167078777 container start b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_hamilton, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:42:20 compute-0 podman[261500]: 2025-12-03 21:42:20.014154023 +0000 UTC m=+0.171115035 container attach b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_hamilton, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:42:20 compute-0 eager_hamilton[261516]: 167 167
Dec 03 21:42:20 compute-0 systemd[1]: libpod-b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12.scope: Deactivated successfully.
Dec 03 21:42:20 compute-0 podman[261500]: 2025-12-03 21:42:20.017369079 +0000 UTC m=+0.174330081 container died b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_hamilton, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.vendor=CentOS)
Dec 03 21:42:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-e86f355f7cf54491cbd211459ef4e5f67a455d893f877ee869572a67a511fecc-merged.mount: Deactivated successfully.
Dec 03 21:42:20 compute-0 podman[261500]: 2025-12-03 21:42:20.06431309 +0000 UTC m=+0.221274062 container remove b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eager_hamilton, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:42:20 compute-0 systemd[1]: libpod-conmon-b9864daee99cad508d4098f09b004f426d81782a644711116faa4e1537476b12.scope: Deactivated successfully.
Dec 03 21:42:20 compute-0 podman[261539]: 2025-12-03 21:42:20.294728786 +0000 UTC m=+0.054887404 container create 06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rhodes, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:42:20 compute-0 systemd[1]: Started libpod-conmon-06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253.scope.
Dec 03 21:42:20 compute-0 podman[261539]: 2025-12-03 21:42:20.267506426 +0000 UTC m=+0.027665094 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:42:20 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:42:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3292546c045abba634357304018129f3c101ebf61141dad611ab6a959a05600/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3292546c045abba634357304018129f3c101ebf61141dad611ab6a959a05600/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3292546c045abba634357304018129f3c101ebf61141dad611ab6a959a05600/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3292546c045abba634357304018129f3c101ebf61141dad611ab6a959a05600/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3292546c045abba634357304018129f3c101ebf61141dad611ab6a959a05600/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:20 compute-0 podman[261539]: 2025-12-03 21:42:20.39392338 +0000 UTC m=+0.154082008 container init 06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rhodes, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:42:20 compute-0 podman[261539]: 2025-12-03 21:42:20.402963893 +0000 UTC m=+0.163122501 container start 06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rhodes, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:42:20 compute-0 podman[261539]: 2025-12-03 21:42:20.406530988 +0000 UTC m=+0.166689636 container attach 06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rhodes, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030)
Dec 03 21:42:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1120: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:20 compute-0 cool_rhodes[261556]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:42:20 compute-0 cool_rhodes[261556]: --> All data devices are unavailable
Dec 03 21:42:20 compute-0 systemd[1]: libpod-06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253.scope: Deactivated successfully.
Dec 03 21:42:20 compute-0 podman[261539]: 2025-12-03 21:42:20.983090789 +0000 UTC m=+0.743249437 container died 06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rhodes, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 03 21:42:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-d3292546c045abba634357304018129f3c101ebf61141dad611ab6a959a05600-merged.mount: Deactivated successfully.
Dec 03 21:42:21 compute-0 podman[261539]: 2025-12-03 21:42:21.039054511 +0000 UTC m=+0.799213129 container remove 06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=cool_rhodes, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 21:42:21 compute-0 systemd[1]: libpod-conmon-06e5764e514414369f6a73774f58b04d368cb3a5a55d0e1cc7ac62ecf0fbd253.scope: Deactivated successfully.
Dec 03 21:42:21 compute-0 sudo[261463]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:21 compute-0 sudo[261589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:42:21 compute-0 sudo[261589]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:21 compute-0 sudo[261589]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:42:21
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['.mgr', 'cephfs.cephfs.data', 'vms', 'images', 'volumes', 'cephfs.cephfs.meta', 'backups']
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:42:21 compute-0 sudo[261614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:42:21 compute-0 sudo[261614]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:21 compute-0 podman[261651]: 2025-12-03 21:42:21.582957464 +0000 UTC m=+0.057522975 container create b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_elion, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS)
Dec 03 21:42:21 compute-0 systemd[1]: Started libpod-conmon-b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242.scope.
Dec 03 21:42:21 compute-0 ceph-mon[75204]: pgmap v1120: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:21 compute-0 podman[261651]: 2025-12-03 21:42:21.565519466 +0000 UTC m=+0.040084977 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:42:21 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:42:21 compute-0 podman[261651]: 2025-12-03 21:42:21.696573105 +0000 UTC m=+0.171138696 container init b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_elion, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:42:21 compute-0 podman[261651]: 2025-12-03 21:42:21.707718724 +0000 UTC m=+0.182284255 container start b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_elion, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:42:21 compute-0 podman[261651]: 2025-12-03 21:42:21.711731392 +0000 UTC m=+0.186296983 container attach b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_elion, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:42:21 compute-0 sad_elion[261667]: 167 167
Dec 03 21:42:21 compute-0 systemd[1]: libpod-b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242.scope: Deactivated successfully.
Dec 03 21:42:21 compute-0 podman[261651]: 2025-12-03 21:42:21.717694742 +0000 UTC m=+0.192260283 container died b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_elion, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:42:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-a9e85a41cafbcde083a0c939a39c641acaa773e3f63dea984bb91bc748942b22-merged.mount: Deactivated successfully.
Dec 03 21:42:21 compute-0 podman[261651]: 2025-12-03 21:42:21.772702519 +0000 UTC m=+0.247268050 container remove b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=sad_elion, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:42:21 compute-0 systemd[1]: libpod-conmon-b9fec94d17a5db13b143dfc1253dda86333fc77dd5ec2d40dfd8d3b1a1a12242.scope: Deactivated successfully.
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:42:21 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:42:21 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:42:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:42:22 compute-0 podman[261692]: 2025-12-03 21:42:22.003934538 +0000 UTC m=+0.065722796 container create 0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_williams, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:42:22 compute-0 systemd[1]: Started libpod-conmon-0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf.scope.
Dec 03 21:42:22 compute-0 podman[261692]: 2025-12-03 21:42:21.971028524 +0000 UTC m=+0.032816842 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:42:22 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:42:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f04551b5697548285dcd1ee17f607b96d022f7ff98d6e2d56fd70d60def0b191/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f04551b5697548285dcd1ee17f607b96d022f7ff98d6e2d56fd70d60def0b191/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f04551b5697548285dcd1ee17f607b96d022f7ff98d6e2d56fd70d60def0b191/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f04551b5697548285dcd1ee17f607b96d022f7ff98d6e2d56fd70d60def0b191/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:22 compute-0 podman[261692]: 2025-12-03 21:42:22.103767638 +0000 UTC m=+0.165555876 container init 0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_williams, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Dec 03 21:42:22 compute-0 podman[261692]: 2025-12-03 21:42:22.117893877 +0000 UTC m=+0.179682135 container start 0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_williams, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:42:22 compute-0 podman[261692]: 2025-12-03 21:42:22.121775982 +0000 UTC m=+0.183564220 container attach 0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_williams, CEPH_REF=tentacle, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:42:22 compute-0 unruffled_williams[261709]: {
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:     "0": [
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:         {
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "devices": [
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "/dev/loop3"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             ],
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_name": "ceph_lv0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_size": "21470642176",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "name": "ceph_lv0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "tags": {
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cluster_name": "ceph",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.crush_device_class": "",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.encrypted": "0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.objectstore": "bluestore",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osd_id": "0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.type": "block",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.vdo": "0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.with_tpm": "0"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             },
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "type": "block",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "vg_name": "ceph_vg0"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:         }
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:     ],
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:     "1": [
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:         {
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "devices": [
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "/dev/loop4"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             ],
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_name": "ceph_lv1",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_size": "21470642176",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "name": "ceph_lv1",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "tags": {
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cluster_name": "ceph",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.crush_device_class": "",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.encrypted": "0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.objectstore": "bluestore",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osd_id": "1",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.type": "block",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.vdo": "0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.with_tpm": "0"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             },
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "type": "block",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "vg_name": "ceph_vg1"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:         }
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:     ],
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:     "2": [
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:         {
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "devices": [
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "/dev/loop5"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             ],
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_name": "ceph_lv2",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_size": "21470642176",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "name": "ceph_lv2",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "tags": {
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.cluster_name": "ceph",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.crush_device_class": "",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.encrypted": "0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.objectstore": "bluestore",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osd_id": "2",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.type": "block",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.vdo": "0",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:                 "ceph.with_tpm": "0"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             },
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "type": "block",
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:             "vg_name": "ceph_vg2"
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:         }
Dec 03 21:42:22 compute-0 unruffled_williams[261709]:     ]
Dec 03 21:42:22 compute-0 unruffled_williams[261709]: }
Dec 03 21:42:22 compute-0 systemd[1]: libpod-0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf.scope: Deactivated successfully.
Dec 03 21:42:22 compute-0 podman[261692]: 2025-12-03 21:42:22.449362487 +0000 UTC m=+0.511150735 container died 0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_williams, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:42:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-f04551b5697548285dcd1ee17f607b96d022f7ff98d6e2d56fd70d60def0b191-merged.mount: Deactivated successfully.
Dec 03 21:42:22 compute-0 podman[261692]: 2025-12-03 21:42:22.498507186 +0000 UTC m=+0.560295414 container remove 0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=unruffled_williams, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default)
Dec 03 21:42:22 compute-0 systemd[1]: libpod-conmon-0e3abade3ce6723b772d9007b5f2d525f86c4d807bca0335da6beb82ebf635bf.scope: Deactivated successfully.
Dec 03 21:42:22 compute-0 sudo[261614]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1121: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:22 compute-0 sudo[261730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:42:22 compute-0 sudo[261730]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:22 compute-0 sudo[261730]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:22 compute-0 sudo[261755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:42:22 compute-0 sudo[261755]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:23 compute-0 podman[261791]: 2025-12-03 21:42:23.09050936 +0000 UTC m=+0.039639135 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:42:23 compute-0 podman[261791]: 2025-12-03 21:42:23.594433571 +0000 UTC m=+0.543563296 container create 42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_shockley, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle)
Dec 03 21:42:23 compute-0 systemd[1]: Started libpod-conmon-42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b.scope.
Dec 03 21:42:23 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:42:23 compute-0 podman[261806]: 2025-12-03 21:42:23.880513081 +0000 UTC m=+0.231247509 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 03 21:42:23 compute-0 podman[261805]: 2025-12-03 21:42:23.886707048 +0000 UTC m=+0.237549379 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 21:42:24 compute-0 ceph-mon[75204]: pgmap v1121: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:24 compute-0 podman[261791]: 2025-12-03 21:42:24.41120959 +0000 UTC m=+1.360339305 container init 42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_shockley, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:42:24 compute-0 podman[261791]: 2025-12-03 21:42:24.418371922 +0000 UTC m=+1.367501647 container start 42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_shockley, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, io.buildah.version=1.41.3)
Dec 03 21:42:24 compute-0 modest_shockley[261833]: 167 167
Dec 03 21:42:24 compute-0 systemd[1]: libpod-42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b.scope: Deactivated successfully.
Dec 03 21:42:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1122: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:25 compute-0 podman[261791]: 2025-12-03 21:42:25.096851689 +0000 UTC m=+2.045981424 container attach 42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_shockley, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 03 21:42:25 compute-0 podman[261791]: 2025-12-03 21:42:25.097367783 +0000 UTC m=+2.046497508 container died 42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_shockley, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:42:25 compute-0 nova_compute[241566]: 2025-12-03 21:42:25.562 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:42:25 compute-0 ceph-mon[75204]: pgmap v1122: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-7023e6cbcd300db409e97a10882b8e3356c4d1563057672ce4e8fd90f2d81f91-merged.mount: Deactivated successfully.
Dec 03 21:42:26 compute-0 podman[261791]: 2025-12-03 21:42:26.474129047 +0000 UTC m=+3.423258782 container remove 42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=modest_shockley, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 03 21:42:26 compute-0 systemd[1]: libpod-conmon-42ed72c759ea3d48d6ce4bfb4209a1b9d03751ec9f6d208f738ec0415ae3840b.scope: Deactivated successfully.
Dec 03 21:42:26 compute-0 nova_compute[241566]: 2025-12-03 21:42:26.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:42:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1123: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:26 compute-0 podman[261870]: 2025-12-03 21:42:26.670051408 +0000 UTC m=+0.027097558 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:42:27 compute-0 podman[261870]: 2025-12-03 21:42:27.068490295 +0000 UTC m=+0.425536425 container create 2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_morse, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:42:27 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:27 compute-0 nova_compute[241566]: 2025-12-03 21:42:27.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:42:27 compute-0 nova_compute[241566]: 2025-12-03 21:42:27.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:42:27 compute-0 nova_compute[241566]: 2025-12-03 21:42:27.552 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:42:27 compute-0 systemd[1]: Started libpod-conmon-2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349.scope.
Dec 03 21:42:27 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:42:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/824779ca59b45ae6611dadb2ed411ec0f241d00059d9f99b908a8e6ae7b0eaba/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/824779ca59b45ae6611dadb2ed411ec0f241d00059d9f99b908a8e6ae7b0eaba/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/824779ca59b45ae6611dadb2ed411ec0f241d00059d9f99b908a8e6ae7b0eaba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/824779ca59b45ae6611dadb2ed411ec0f241d00059d9f99b908a8e6ae7b0eaba/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:42:27 compute-0 podman[261870]: 2025-12-03 21:42:27.836076404 +0000 UTC m=+1.193122564 container init 2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_morse, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Dec 03 21:42:27 compute-0 podman[261870]: 2025-12-03 21:42:27.8478121 +0000 UTC m=+1.204858260 container start 2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_morse, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:42:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:42:28 compute-0 podman[261870]: 2025-12-03 21:42:28.035394007 +0000 UTC m=+1.392440137 container attach 2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_morse, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 03 21:42:28 compute-0 ceph-mon[75204]: pgmap v1123: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:28 compute-0 lvm[261968]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:42:28 compute-0 lvm[261968]: VG ceph_vg2 finished
Dec 03 21:42:28 compute-0 lvm[261967]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:42:28 compute-0 lvm[261967]: VG ceph_vg1 finished
Dec 03 21:42:28 compute-0 lvm[261964]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:42:28 compute-0 lvm[261964]: VG ceph_vg0 finished
Dec 03 21:42:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1124: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:28 compute-0 brave_morse[261887]: {}
Dec 03 21:42:28 compute-0 systemd[1]: libpod-2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349.scope: Deactivated successfully.
Dec 03 21:42:28 compute-0 systemd[1]: libpod-2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349.scope: Consumed 1.376s CPU time.
Dec 03 21:42:28 compute-0 podman[261870]: 2025-12-03 21:42:28.657223062 +0000 UTC m=+2.014269232 container died 2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_morse, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 03 21:42:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-824779ca59b45ae6611dadb2ed411ec0f241d00059d9f99b908a8e6ae7b0eaba-merged.mount: Deactivated successfully.
Dec 03 21:42:29 compute-0 ceph-mon[75204]: pgmap v1124: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:30 compute-0 podman[261870]: 2025-12-03 21:42:30.246125544 +0000 UTC m=+3.603171714 container remove 2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=brave_morse, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 03 21:42:30 compute-0 systemd[1]: libpod-conmon-2283f010186107884e92d3673032ee6c54c805e8e407d577ca658aed974b2349.scope: Deactivated successfully.
Dec 03 21:42:30 compute-0 sudo[261755]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:42:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1125: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:30 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:42:30 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:30 compute-0 sudo[261986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:42:30 compute-0 sudo[261986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:42:30 compute-0 sudo[261986]: pam_unix(sudo:session): session closed for user root
Dec 03 21:42:31 compute-0 ceph-mon[75204]: pgmap v1125: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:31 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:42:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1126: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:34 compute-0 ceph-mon[75204]: pgmap v1126: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1127: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:36 compute-0 ceph-mon[75204]: pgmap v1127: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1128: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:37 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:38 compute-0 ceph-mon[75204]: pgmap v1128: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1129: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:40 compute-0 podman[262011]: 2025-12-03 21:42:40.246139784 +0000 UTC m=+0.154642913 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 03 21:42:40 compute-0 ceph-mon[75204]: pgmap v1129: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1130: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:41 compute-0 ceph-mon[75204]: pgmap v1130: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:42:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1131: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1132: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1133: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1134: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:42:48.951 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:42:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:42:48.951 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:42:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:42:48.951 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:42:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1135: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:50 compute-0 ceph-mds[93586]: mds.beacon.cephfs.compute-0.gzkqle missed beacon ack from the monitors
Dec 03 21:42:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:42:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:42:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:42:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:42:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:42:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:42:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1136: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:54 compute-0 podman[262039]: 2025-12-03 21:42:54.157474271 +0000 UTC m=+0.085305041 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 03 21:42:54 compute-0 podman[262038]: 2025-12-03 21:42:54.162603099 +0000 UTC m=+0.094066696 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 03 21:42:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1137: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:54 compute-0 ceph-mds[93586]: mds.beacon.cephfs.compute-0.gzkqle missed beacon ack from the monitors
Dec 03 21:42:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1138: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:58 compute-0 ceph-mds[93586]: mds.beacon.cephfs.compute-0.gzkqle MDS connection to Monitors appears to be laggy; 15.2094s since last acked beacon
Dec 03 21:42:58 compute-0 ceph-mds[93586]: mds.0.4 skipping upkeep work because connection to Monitors appears laggy
Dec 03 21:42:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1139: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:42:58 compute-0 ceph-mds[93586]: mds.beacon.cephfs.compute-0.gzkqle missed beacon ack from the monitors
Dec 03 21:43:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1140: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1141: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:02 compute-0 ceph-mds[93586]: mds.beacon.cephfs.compute-0.gzkqle missed beacon ack from the monitors
Dec 03 21:43:03 compute-0 ceph-mds[93586]: mds.0.4 skipping upkeep work because connection to Monitors appears laggy
Dec 03 21:43:04 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1142: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).mds e5 check_health: resetting beacon timeouts due to mon delay (slow election?) of 2e+01 seconds
Dec 03 21:43:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:05 compute-0 ceph-mds[93586]: mds.beacon.cephfs.compute-0.gzkqle  MDS is no longer laggy
Dec 03 21:43:05 compute-0 ceph-mon[75204]: pgmap v1131: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.430 241570 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 31.69 sec
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.511 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.512 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.512 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.513 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.513 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.513 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.514 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.514 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.548 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.549 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.550 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.550 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:43:05 compute-0 nova_compute[241566]: 2025-12-03 21:43:05.551 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:43:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:43:05 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3643027809' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:43:05 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:43:05 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3643027809' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:43:06 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:43:06 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3008246827' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.105 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.554s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1132: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1133: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1134: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1135: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1136: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1137: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1138: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1139: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1140: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1141: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: pgmap v1142: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3643027809' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:43:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3643027809' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:43:06 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3008246827' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.343 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.345 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5156MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.345 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.345 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.500 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.501 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:43:06 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1143: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.732 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing inventories for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.784 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Updating ProviderTree inventory for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.785 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Updating inventory in ProviderTree for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.799 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing aggregate associations for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.833 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Refreshing trait associations for resource provider 94aba67c-5c5e-45d0-83d1-33eb467c8775, traits: HW_CPU_X86_SSE4A,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_BMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,HW_CPU_X86_MMX,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_AESNI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_F16C,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 03 21:43:06 compute-0 nova_compute[241566]: 2025-12-03 21:43:06.848 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:43:07 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:43:07 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3631726271' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.366 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.518s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.374 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.406 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.408 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.408 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.409 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.409 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.430 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.431 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:07 compute-0 nova_compute[241566]: 2025-12-03 21:43:07.432 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 03 21:43:08 compute-0 ceph-mon[75204]: pgmap v1143: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:08 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3631726271' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:08 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1144: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:10 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:10 compute-0 ceph-mon[75204]: pgmap v1144: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:10 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1145: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:11 compute-0 podman[262123]: 2025-12-03 21:43:11.177939446 +0000 UTC m=+0.116141209 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 03 21:43:12 compute-0 ceph-mon[75204]: pgmap v1145: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:12 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1146: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:14 compute-0 ceph-mon[75204]: pgmap v1146: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:14 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1147: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:15 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:15 compute-0 ceph-mon[75204]: pgmap v1147: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:16 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1148: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:18 compute-0 ceph-mon[75204]: pgmap v1148: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:18 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1149: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:20 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:20 compute-0 ceph-mon[75204]: pgmap v1149: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:20 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1150: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Optimize plan auto_2025-12-03_21:43:21
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [balancer INFO root] do_upmap
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [balancer INFO root] pools ['cephfs.cephfs.meta', 'backups', 'volumes', 'images', 'vms', 'cephfs.cephfs.data', '.mgr']
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [balancer INFO root] prepared 0/10 upmap changes
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:43:21 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 03 21:43:22 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1151: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:22 compute-0 ceph-mon[75204]: pgmap v1150: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:23 compute-0 ceph-mon[75204]: pgmap v1151: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:24 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1152: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:25 compute-0 podman[262152]: 2025-12-03 21:43:25.14275306 +0000 UTC m=+0.063928098 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 03 21:43:25 compute-0 podman[262151]: 2025-12-03 21:43:25.142720879 +0000 UTC m=+0.075906959 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 03 21:43:25 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:25 compute-0 ceph-mon[75204]: pgmap v1152: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:26 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1153: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:27 compute-0 ceph-mon[75204]: pgmap v1153: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] _maybe_adjust
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 7.185749983720779e-06 of space, bias 1.0, pg target 0.0021557249951162337 quantized to 1 (current 1)
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 3.533351853729544e-07 of space, bias 1.0, pg target 0.00010600055561188632 quantized to 32 (current 32)
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 9.833582600959152e-08 of space, bias 1.0, pg target 2.9500747802877454e-05 quantized to 32 (current 32)
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0006678894471709225 of space, bias 1.0, pg target 0.20036683415127676 quantized to 32 (current 32)
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.meta' root_id -1 using 7.953496000112683e-07 of space, bias 4.0, pg target 0.0009544195200135219 quantized to 16 (current 16)
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 64411926528
Dec 03 21:43:27 compute-0 ceph-mgr[75500]: [pg_autoscaler INFO root] Pool 'cephfs.cephfs.data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 03 21:43:28 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1154: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:28 compute-0 sshd-session[262191]: Accepted publickey for zuul from 192.168.122.10 port 47482 ssh2: ECDSA SHA256:I6A6wQ90+FV5E3fwFBPJR5gIYfftG9mreaiTk4gUp2c
Dec 03 21:43:28 compute-0 systemd-logind[787]: New session 54 of user zuul.
Dec 03 21:43:28 compute-0 systemd[1]: Started Session 54 of User zuul.
Dec 03 21:43:28 compute-0 sshd-session[262191]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 03 21:43:28 compute-0 sudo[262195]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 03 21:43:28 compute-0 sudo[262195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 03 21:43:29 compute-0 ceph-mon[75204]: pgmap v1154: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:30 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:30 compute-0 nova_compute[241566]: 2025-12-03 21:43:30.494 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:30 compute-0 nova_compute[241566]: 2025-12-03 21:43:30.496 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:30 compute-0 nova_compute[241566]: 2025-12-03 21:43:30.496 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 03 21:43:30 compute-0 nova_compute[241566]: 2025-12-03 21:43:30.496 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 03 21:43:30 compute-0 nova_compute[241566]: 2025-12-03 21:43:30.534 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 03 21:43:30 compute-0 nova_compute[241566]: 2025-12-03 21:43:30.534 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:30 compute-0 nova_compute[241566]: 2025-12-03 21:43:30.534 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:30 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1155: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:30 compute-0 sudo[262331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:43:30 compute-0 sudo[262331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:30 compute-0 sudo[262331]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:31 compute-0 sudo[262372]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --timeout 895 gather-facts
Dec 03 21:43:31 compute-0 sudo[262372]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:31 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15014 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:31 compute-0 nova_compute[241566]: 2025-12-03 21:43:31.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:31 compute-0 nova_compute[241566]: 2025-12-03 21:43:31.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:31 compute-0 sudo[262372]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:31 compute-0 ceph-mon[75204]: pgmap v1155: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:31 compute-0 ceph-mon[75204]: from='client.15014 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:43:31 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:43:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 03 21:43:31 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:43:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 03 21:43:31 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:43:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 03 21:43:31 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:43:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "auth get", "entity": "client.bootstrap-osd"} v 0)
Dec 03 21:43:31 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:43:31 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:43:31 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:43:31 compute-0 sudo[262462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:43:31 compute-0 sudo[262462]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:31 compute-0 sudo[262462]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:31 compute-0 sudo[262502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 /dev/ceph_vg2/ceph_lv2 --objectstore bluestore --yes --no-systemd
Dec 03 21:43:31 compute-0 sudo[262502]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:32 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15016 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:32 compute-0 podman[262544]: 2025-12-03 21:43:32.315017669 +0000 UTC m=+0.058087804 container create 01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_elgamal, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=tentacle, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:43:32 compute-0 systemd[1]: Started libpod-conmon-01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d.scope.
Dec 03 21:43:32 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:43:32 compute-0 podman[262544]: 2025-12-03 21:43:32.287102727 +0000 UTC m=+0.030172882 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:43:32 compute-0 podman[262544]: 2025-12-03 21:43:32.399231597 +0000 UTC m=+0.142301742 container init 01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_elgamal, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:43:32 compute-0 podman[262544]: 2025-12-03 21:43:32.405952858 +0000 UTC m=+0.149023013 container start 01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_elgamal, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030)
Dec 03 21:43:32 compute-0 podman[262544]: 2025-12-03 21:43:32.409314139 +0000 UTC m=+0.152384294 container attach 01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_elgamal, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:43:32 compute-0 systemd[1]: libpod-01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d.scope: Deactivated successfully.
Dec 03 21:43:32 compute-0 vibrant_elgamal[262579]: 167 167
Dec 03 21:43:32 compute-0 conmon[262579]: conmon 01dbbd80f8ac54a7841a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d.scope/container/memory.events
Dec 03 21:43:32 compute-0 podman[262544]: 2025-12-03 21:43:32.412877444 +0000 UTC m=+0.155947579 container died 01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_elgamal, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, OSD_FLAVOR=default)
Dec 03 21:43:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-cb60def95016433e9c1772bc4e1d907bdae96dc10c60b2fd1ab206d8a3e1f974-merged.mount: Deactivated successfully.
Dec 03 21:43:32 compute-0 podman[262544]: 2025-12-03 21:43:32.463492727 +0000 UTC m=+0.206562882 container remove 01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=vibrant_elgamal, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:43:32 compute-0 systemd[1]: libpod-conmon-01dbbd80f8ac54a7841a221a006ac85ba98cfa0750edc3a5a1fec80ff14f027d.scope: Deactivated successfully.
Dec 03 21:43:32 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1156: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:32 compute-0 podman[262603]: 2025-12-03 21:43:32.65077699 +0000 UTC m=+0.045757113 container create 6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_borg, CEPH_REF=tentacle, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, ceph=True)
Dec 03 21:43:32 compute-0 systemd[1]: Started libpod-conmon-6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906.scope.
Dec 03 21:43:32 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:43:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f5e65fa36f75a09804316109c16436691c98463d39d92b2ffd7f280ab83a34e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:32 compute-0 podman[262603]: 2025-12-03 21:43:32.633795673 +0000 UTC m=+0.028775796 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:43:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f5e65fa36f75a09804316109c16436691c98463d39d92b2ffd7f280ab83a34e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f5e65fa36f75a09804316109c16436691c98463d39d92b2ffd7f280ab83a34e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f5e65fa36f75a09804316109c16436691c98463d39d92b2ffd7f280ab83a34e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f5e65fa36f75a09804316109c16436691c98463d39d92b2ffd7f280ab83a34e/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:32 compute-0 podman[262603]: 2025-12-03 21:43:32.746120568 +0000 UTC m=+0.141100681 container init 6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_borg, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle)
Dec 03 21:43:32 compute-0 podman[262603]: 2025-12-03 21:43:32.757306049 +0000 UTC m=+0.152286172 container start 6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_borg, io.buildah.version=1.41.3, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:43:32 compute-0 podman[262603]: 2025-12-03 21:43:32.761892063 +0000 UTC m=+0.156872176 container attach 6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_borg, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:43:32 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "status"} v 0)
Dec 03 21:43:32 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/680629300' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "auth get", "entity": "client.bootstrap-osd"} : dispatch
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='client.15016 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:32 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/680629300' entity='client.admin' cmd={"prefix": "status"} : dispatch
Dec 03 21:43:33 compute-0 thirsty_borg[262620]: --> passed data devices: 0 physical, 3 LVM
Dec 03 21:43:33 compute-0 thirsty_borg[262620]: --> All data devices are unavailable
Dec 03 21:43:33 compute-0 systemd[1]: libpod-6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906.scope: Deactivated successfully.
Dec 03 21:43:33 compute-0 podman[262603]: 2025-12-03 21:43:33.33593948 +0000 UTC m=+0.730919573 container died 6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_borg, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:43:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-2f5e65fa36f75a09804316109c16436691c98463d39d92b2ffd7f280ab83a34e-merged.mount: Deactivated successfully.
Dec 03 21:43:33 compute-0 podman[262603]: 2025-12-03 21:43:33.383659695 +0000 UTC m=+0.778639788 container remove 6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=thirsty_borg, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0)
Dec 03 21:43:33 compute-0 systemd[1]: libpod-conmon-6799f00a58e92e62afa45251e3a23b3cc0d9e99ca2ece2ac38118a0c66fe0906.scope: Deactivated successfully.
Dec 03 21:43:33 compute-0 sudo[262502]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:33 compute-0 sudo[262660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:43:33 compute-0 sudo[262660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:33 compute-0 sudo[262660]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:33 compute-0 nova_compute[241566]: 2025-12-03 21:43:33.551 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:33 compute-0 sudo[262685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- lvm list --format json
Dec 03 21:43:33 compute-0 sudo[262685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:33 compute-0 ceph-mon[75204]: pgmap v1156: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:33 compute-0 podman[262721]: 2025-12-03 21:43:33.986445317 +0000 UTC m=+0.060034979 container create 4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_blackwell, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 03 21:43:34 compute-0 systemd[1]: Started libpod-conmon-4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61.scope.
Dec 03 21:43:34 compute-0 podman[262721]: 2025-12-03 21:43:33.964741662 +0000 UTC m=+0.038331334 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:43:34 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:43:34 compute-0 podman[262721]: 2025-12-03 21:43:34.090853868 +0000 UTC m=+0.164443530 container init 4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_blackwell, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2)
Dec 03 21:43:34 compute-0 podman[262721]: 2025-12-03 21:43:34.101720591 +0000 UTC m=+0.175310253 container start 4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_blackwell, org.label-schema.license=GPLv2, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 03 21:43:34 compute-0 podman[262721]: 2025-12-03 21:43:34.106036107 +0000 UTC m=+0.179625779 container attach 4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_blackwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 03 21:43:34 compute-0 gallant_blackwell[262737]: 167 167
Dec 03 21:43:34 compute-0 podman[262721]: 2025-12-03 21:43:34.109012197 +0000 UTC m=+0.182601879 container died 4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_blackwell, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:43:34 compute-0 systemd[1]: libpod-4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61.scope: Deactivated successfully.
Dec 03 21:43:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-fd7f5f0b3070a2cdbee76722a3c3f455ced840f90b2ba85fdff9aaef9cb063b5-merged.mount: Deactivated successfully.
Dec 03 21:43:34 compute-0 podman[262721]: 2025-12-03 21:43:34.153523866 +0000 UTC m=+0.227113528 container remove 4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=gallant_blackwell, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 03 21:43:34 compute-0 systemd[1]: libpod-conmon-4a09349cd5ccce9a22ba446c0b43f79c9a83844efec17d07d508b48033106b61.scope: Deactivated successfully.
Dec 03 21:43:34 compute-0 podman[262760]: 2025-12-03 21:43:34.362097532 +0000 UTC m=+0.052418373 container create 59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nice_taussig, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 03 21:43:34 compute-0 systemd[1]: Started libpod-conmon-59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f.scope.
Dec 03 21:43:34 compute-0 podman[262760]: 2025-12-03 21:43:34.337308474 +0000 UTC m=+0.027629395 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:43:34 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:43:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7395a1003d6b6026a709cbe8b0f77fc3f08645e5fc7a66773b377e9c7e5a4b1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7395a1003d6b6026a709cbe8b0f77fc3f08645e5fc7a66773b377e9c7e5a4b1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7395a1003d6b6026a709cbe8b0f77fc3f08645e5fc7a66773b377e9c7e5a4b1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7395a1003d6b6026a709cbe8b0f77fc3f08645e5fc7a66773b377e9c7e5a4b1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:34 compute-0 podman[262760]: 2025-12-03 21:43:34.480106339 +0000 UTC m=+0.170427270 container init 59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nice_taussig, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 03 21:43:34 compute-0 podman[262760]: 2025-12-03 21:43:34.486843301 +0000 UTC m=+0.177164162 container start 59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nice_taussig, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=tentacle, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20251030, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS)
Dec 03 21:43:34 compute-0 podman[262760]: 2025-12-03 21:43:34.48975949 +0000 UTC m=+0.180080341 container attach 59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nice_taussig, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20251030, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:43:34 compute-0 nova_compute[241566]: 2025-12-03 21:43:34.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:34 compute-0 nova_compute[241566]: 2025-12-03 21:43:34.554 241570 DEBUG nova.compute.manager [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 03 21:43:34 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1157: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:34 compute-0 nice_taussig[262781]: {
Dec 03 21:43:34 compute-0 nice_taussig[262781]:     "0": [
Dec 03 21:43:34 compute-0 nice_taussig[262781]:         {
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "devices": [
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "/dev/loop3"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             ],
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_name": "ceph_lv0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_size": "21470642176",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=4d33bc95-baf8-481d-bc78-3b15ffd29872,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "name": "ceph_lv0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "tags": {
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.block_uuid": "PwNUS4-BHZk-ELZr-3583-tr9Z-qLm9-dy1sXV",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cluster_name": "ceph",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.crush_device_class": "",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.encrypted": "0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.objectstore": "bluestore",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osd_fsid": "4d33bc95-baf8-481d-bc78-3b15ffd29872",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osd_id": "0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.type": "block",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.vdo": "0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.with_tpm": "0"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             },
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "type": "block",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "vg_name": "ceph_vg0"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:         }
Dec 03 21:43:34 compute-0 nice_taussig[262781]:     ],
Dec 03 21:43:34 compute-0 nice_taussig[262781]:     "1": [
Dec 03 21:43:34 compute-0 nice_taussig[262781]:         {
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "devices": [
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "/dev/loop4"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             ],
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_name": "ceph_lv1",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_size": "21470642176",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=c4086f1b-ff53-4e63-8dc0-011238d77976,ceph.osd_id=1,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "name": "ceph_lv1",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "path": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "tags": {
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.block_uuid": "T7ogio-vnKG-UnmK-tQf4-EOV3-bgiF-2ulcyV",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cluster_name": "ceph",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.crush_device_class": "",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.encrypted": "0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.objectstore": "bluestore",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osd_fsid": "c4086f1b-ff53-4e63-8dc0-011238d77976",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osd_id": "1",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.type": "block",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.vdo": "0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.with_tpm": "0"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             },
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "type": "block",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "vg_name": "ceph_vg1"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:         }
Dec 03 21:43:34 compute-0 nice_taussig[262781]:     ],
Dec 03 21:43:34 compute-0 nice_taussig[262781]:     "2": [
Dec 03 21:43:34 compute-0 nice_taussig[262781]:         {
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "devices": [
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "/dev/loop5"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             ],
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_name": "ceph_lv2",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_size": "21470642176",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_tags": "ceph.block_device=/dev/ceph_vg2/ceph_lv2,ceph.block_uuid=ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=c21de27e-a7fd-594b-8324-0697ba9aab3a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.objectstore=bluestore,ceph.osd_fsid=abcd6a67-9013-4470-978f-f75da5f33cd4,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "lv_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "name": "ceph_lv2",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "path": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "tags": {
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.block_device": "/dev/ceph_vg2/ceph_lv2",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.block_uuid": "ShGq8u-AIcR-gjG5-ke7J-C2kz-D2Cd-22IOFY",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cephx_lockbox_secret": "",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cluster_fsid": "c21de27e-a7fd-594b-8324-0697ba9aab3a",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.cluster_name": "ceph",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.crush_device_class": "",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.encrypted": "0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.objectstore": "bluestore",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osd_fsid": "abcd6a67-9013-4470-978f-f75da5f33cd4",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osd_id": "2",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.type": "block",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.vdo": "0",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:                 "ceph.with_tpm": "0"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             },
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "type": "block",
Dec 03 21:43:34 compute-0 nice_taussig[262781]:             "vg_name": "ceph_vg2"
Dec 03 21:43:34 compute-0 nice_taussig[262781]:         }
Dec 03 21:43:34 compute-0 nice_taussig[262781]:     ]
Dec 03 21:43:34 compute-0 nice_taussig[262781]: }
Dec 03 21:43:34 compute-0 systemd[1]: libpod-59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f.scope: Deactivated successfully.
Dec 03 21:43:34 compute-0 podman[262760]: 2025-12-03 21:43:34.824723259 +0000 UTC m=+0.515044110 container died 59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nice_taussig, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:43:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-f7395a1003d6b6026a709cbe8b0f77fc3f08645e5fc7a66773b377e9c7e5a4b1-merged.mount: Deactivated successfully.
Dec 03 21:43:34 compute-0 podman[262760]: 2025-12-03 21:43:34.865810466 +0000 UTC m=+0.556131297 container remove 59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=nice_taussig, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Dec 03 21:43:34 compute-0 systemd[1]: libpod-conmon-59779ec8140b78d4036e77376986a5da67d09c3492dc7fb5486a589100bc8b9f.scope: Deactivated successfully.
Dec 03 21:43:34 compute-0 sudo[262685]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:34 compute-0 sudo[262802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 03 21:43:34 compute-0 sudo[262802]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:34 compute-0 sudo[262802]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:35 compute-0 sudo[262827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/c21de27e-a7fd-594b-8324-0697ba9aab3a/cephadm.ed5a13ad26f7f55dd30e9b63855e4e581fd86973bec1d21a12ed0bb26af19c8b --image quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86 --timeout 895 ceph-volume --fsid c21de27e-a7fd-594b-8324-0697ba9aab3a -- raw list --format json
Dec 03 21:43:35 compute-0 sudo[262827]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:35 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:35 compute-0 podman[262874]: 2025-12-03 21:43:35.39072026 +0000 UTC m=+0.049511424 container create bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_jackson, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, org.label-schema.schema-version=1.0)
Dec 03 21:43:35 compute-0 systemd[1]: Started libpod-conmon-bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f.scope.
Dec 03 21:43:35 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:43:35 compute-0 podman[262874]: 2025-12-03 21:43:35.371984115 +0000 UTC m=+0.030775259 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:43:35 compute-0 podman[262874]: 2025-12-03 21:43:35.480101587 +0000 UTC m=+0.138892751 container init bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_jackson, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.41.3, ceph=True, org.label-schema.build-date=20251030, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:43:35 compute-0 podman[262874]: 2025-12-03 21:43:35.491697649 +0000 UTC m=+0.150488783 container start bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_jackson, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=tentacle, org.label-schema.schema-version=1.0, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 03 21:43:35 compute-0 podman[262874]: 2025-12-03 21:43:35.494842434 +0000 UTC m=+0.153633568 container attach bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_jackson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 03 21:43:35 compute-0 busy_jackson[262899]: 167 167
Dec 03 21:43:35 compute-0 systemd[1]: libpod-bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f.scope: Deactivated successfully.
Dec 03 21:43:35 compute-0 podman[262874]: 2025-12-03 21:43:35.500525206 +0000 UTC m=+0.159316370 container died bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_jackson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.41.3, org.label-schema.build-date=20251030, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=tentacle, org.label-schema.license=GPLv2, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489)
Dec 03 21:43:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-bad2ef6951d78e139d0e3b290d97ac1138d6fb83dbdf61a07047500a4da5e981-merged.mount: Deactivated successfully.
Dec 03 21:43:35 compute-0 podman[262874]: 2025-12-03 21:43:35.550891253 +0000 UTC m=+0.209682387 container remove bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=busy_jackson, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20251030, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 03 21:43:35 compute-0 nova_compute[241566]: 2025-12-03 21:43:35.552 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:35 compute-0 ovs-vsctl[262920]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 03 21:43:35 compute-0 systemd[1]: libpod-conmon-bba68ce820ec2cc0f2f2d45aa4e80ce740a5c6c0a25f2459db34826d6dc6f21f.scope: Deactivated successfully.
Dec 03 21:43:35 compute-0 nova_compute[241566]: 2025-12-03 21:43:35.585 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:43:35 compute-0 nova_compute[241566]: 2025-12-03 21:43:35.586 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:43:35 compute-0 nova_compute[241566]: 2025-12-03 21:43:35.586 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:43:35 compute-0 nova_compute[241566]: 2025-12-03 21:43:35.587 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 03 21:43:35 compute-0 nova_compute[241566]: 2025-12-03 21:43:35.587 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:43:35 compute-0 podman[262962]: 2025-12-03 21:43:35.790055113 +0000 UTC m=+0.059861663 container create 7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eloquent_nightingale, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=tentacle, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 03 21:43:35 compute-0 ceph-mon[75204]: pgmap v1157: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:35 compute-0 systemd[1]: Started libpod-conmon-7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714.scope.
Dec 03 21:43:35 compute-0 podman[262962]: 2025-12-03 21:43:35.765266356 +0000 UTC m=+0.035072986 image pull 524f3da276461682bec27427fb8a63b5139c40ad4185939aede197474a6817b3 quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86
Dec 03 21:43:35 compute-0 systemd[1]: Started libcrun container.
Dec 03 21:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2310d563a1dfab11160840e22124782f875634f2b6767445fe8ad4fa0372419/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2310d563a1dfab11160840e22124782f875634f2b6767445fe8ad4fa0372419/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2310d563a1dfab11160840e22124782f875634f2b6767445fe8ad4fa0372419/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2310d563a1dfab11160840e22124782f875634f2b6767445fe8ad4fa0372419/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 03 21:43:35 compute-0 podman[262962]: 2025-12-03 21:43:35.89093566 +0000 UTC m=+0.160742230 container init 7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eloquent_nightingale, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=tentacle, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.41.3, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 03 21:43:35 compute-0 podman[262962]: 2025-12-03 21:43:35.902772788 +0000 UTC m=+0.172579328 container start 7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eloquent_nightingale, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Dec 03 21:43:35 compute-0 podman[262962]: 2025-12-03 21:43:35.905839291 +0000 UTC m=+0.175645861 container attach 7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eloquent_nightingale, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=tentacle, org.label-schema.build-date=20251030, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Dec 03 21:43:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:43:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/504657527' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.116 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.267 241570 WARNING nova.virt.libvirt.driver [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.269 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=4995MB free_disk=59.988260054029524GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.269 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.269 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.336 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.337 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.350 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 03 21:43:36 compute-0 virtqemud[241184]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 03 21:43:36 compute-0 virtqemud[241184]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 03 21:43:36 compute-0 virtqemud[241184]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 03 21:43:36 compute-0 lvm[263204]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:43:36 compute-0 lvm[263204]: VG ceph_vg0 finished
Dec 03 21:43:36 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1158: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:36 compute-0 lvm[263208]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:43:36 compute-0 lvm[263208]: VG ceph_vg1 finished
Dec 03 21:43:36 compute-0 lvm[263213]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:43:36 compute-0 lvm[263213]: VG ceph_vg2 finished
Dec 03 21:43:36 compute-0 lvm[263215]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:43:36 compute-0 lvm[263215]: VG ceph_vg0 finished
Dec 03 21:43:36 compute-0 eloquent_nightingale[263007]: {}
Dec 03 21:43:36 compute-0 systemd[1]: libpod-7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714.scope: Deactivated successfully.
Dec 03 21:43:36 compute-0 systemd[1]: libpod-7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714.scope: Consumed 1.322s CPU time.
Dec 03 21:43:36 compute-0 conmon[263007]: conmon 7b401a3d65fc0cb9e448 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714.scope/container/memory.events
Dec 03 21:43:36 compute-0 podman[262962]: 2025-12-03 21:43:36.76256282 +0000 UTC m=+1.032369380 container died 7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eloquent_nightingale, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251030, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, CEPH_REF=tentacle, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Dec 03 21:43:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-a2310d563a1dfab11160840e22124782f875634f2b6767445fe8ad4fa0372419-merged.mount: Deactivated successfully.
Dec 03 21:43:36 compute-0 podman[262962]: 2025-12-03 21:43:36.817660774 +0000 UTC m=+1.087467314 container remove 7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714 (image=quay.io/ceph/ceph@sha256:1228c3d05e45fbc068a8c33614e4409b6dac688bcc77369b06009b5830fa8d86, name=eloquent_nightingale, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=tentacle, ceph=True, org.label-schema.build-date=20251030, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=69f84cc2651aa259a15bc192ddaabd3baba07489, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 03 21:43:36 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/504657527' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:36 compute-0 systemd[1]: libpod-conmon-7b401a3d65fc0cb9e44889a2d37a92f1d21a80d4eb7f0d1f23329050d33ca714.scope: Deactivated successfully.
Dec 03 21:43:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 03 21:43:36 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3255983988' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.875 241570 DEBUG oslo_concurrency.processutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.884 241570 DEBUG nova.compute.provider_tree [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed in ProviderTree for provider: 94aba67c-5c5e-45d0-83d1-33eb467c8775 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 03 21:43:36 compute-0 sudo[262827]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0.devices.0}] v 0)
Dec 03 21:43:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:43:36 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.compute-0}] v 0)
Dec 03 21:43:36 compute-0 ceph-mon[75204]: log_channel(audit) log [INF] : from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.916 241570 DEBUG nova.scheduler.client.report [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Inventory has not changed for provider 94aba67c-5c5e-45d0-83d1-33eb467c8775 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.918 241570 DEBUG nova.compute.resource_tracker [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 03 21:43:36 compute-0 nova_compute[241566]: 2025-12-03 21:43:36.918 241570 DEBUG oslo_concurrency.lockutils [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:43:36 compute-0 sudo[263295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 03 21:43:36 compute-0 sudo[263295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 03 21:43:36 compute-0 sudo[263295]: pam_unix(sudo:session): session closed for user root
Dec 03 21:43:37 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: cache status {prefix=cache status} (starting...)
Dec 03 21:43:37 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: client ls {prefix=client ls} (starting...)
Dec 03 21:43:37 compute-0 lvm[263444]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 03 21:43:37 compute-0 lvm[263444]: VG ceph_vg0 finished
Dec 03 21:43:37 compute-0 lvm[263452]: PV /dev/loop5 online, VG ceph_vg2 is complete.
Dec 03 21:43:37 compute-0 lvm[263452]: VG ceph_vg2 finished
Dec 03 21:43:37 compute-0 lvm[263484]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 03 21:43:37 compute-0 lvm[263484]: VG ceph_vg1 finished
Dec 03 21:43:37 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15024 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:37 compute-0 ceph-mon[75204]: pgmap v1158: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:37 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3255983988' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 03 21:43:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:43:37 compute-0 ceph-mon[75204]: from='mgr.14122 192.168.122.100:0/2726886298' entity='mgr.compute-0.jxauqt' 
Dec 03 21:43:37 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: damage ls {prefix=damage ls} (starting...)
Dec 03 21:43:37 compute-0 nova_compute[241566]: 2025-12-03 21:43:37.912 241570 DEBUG oslo_service.periodic_task [None req-71bc7e37-c187-47ed-86bc-0ddfb5097d38 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 03 21:43:38 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump loads {prefix=dump loads} (starting...)
Dec 03 21:43:38 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 03 21:43:38 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15026 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:38 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 03 21:43:38 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 03 21:43:38 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 03 21:43:38 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15028 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:38 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1159: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:38 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "report"} v 0)
Dec 03 21:43:38 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/382969193' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 03 21:43:38 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 03 21:43:38 compute-0 ceph-mon[75204]: from='client.15024 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:38 compute-0 ceph-mon[75204]: from='client.15026 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:38 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/382969193' entity='client.admin' cmd={"prefix": "report"} : dispatch
Dec 03 21:43:38 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 03 21:43:39 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15032 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:39 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: 2025-12-03T21:43:39.095+0000 7fa8c63a3640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:43:39 compute-0 ceph-mgr[75500]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:43:39 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: ops {prefix=ops} (starting...)
Dec 03 21:43:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 03 21:43:39 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/784569290' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:43:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 03 21:43:39 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3601736007' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 03 21:43:39 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config log"} v 0)
Dec 03 21:43:39 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/620582623' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 03 21:43:39 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: session ls {prefix=session ls} (starting...)
Dec 03 21:43:39 compute-0 ceph-mon[75204]: from='client.15028 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:39 compute-0 ceph-mon[75204]: pgmap v1159: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:39 compute-0 ceph-mon[75204]: from='client.15032 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:39 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/784569290' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 03 21:43:39 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3601736007' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Dec 03 21:43:39 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/620582623' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Dec 03 21:43:39 compute-0 ceph-mds[93586]: mds.cephfs.compute-0.gzkqle asok_command: status {prefix=status} (starting...)
Dec 03 21:43:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 03 21:43:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2292369582' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:43:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 03 21:43:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1133625465' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 03 21:43:40 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1160: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:40 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 03 21:43:40 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1088593252' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:43:40 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15046 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2292369582' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:43:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1133625465' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Dec 03 21:43:40 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1088593252' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 03 21:43:41 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1843679869' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:43:41 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15050 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 03 21:43:41 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3593633270' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "features"} v 0)
Dec 03 21:43:41 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2892314225' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: pgmap v1160: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:41 compute-0 ceph-mon[75204]: from='client.15046 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1843679869' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: from='client.15050 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3593633270' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:43:41 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2892314225' entity='client.admin' cmd={"prefix": "features"} : dispatch
Dec 03 21:43:42 compute-0 podman[264012]: 2025-12-03 21:43:42.158431957 +0000 UTC m=+0.094593279 container health_status eea7b0f6b73da2db5c525e534cccce9e79c080e649bef7a358a1c660ff7b788b (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 03 21:43:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 03 21:43:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4279544096' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 03 21:43:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 03 21:43:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1744587186' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 03 21:43:42 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1161: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:42 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15062 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:42 compute-0 ceph-mgr[75500]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 03 21:43:42 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: 2025-12-03T21:43:42.767+0000 7fa8c63a3640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Dec 03 21:43:42 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 03 21:43:42 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2889469172' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:43:42 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4279544096' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Dec 03 21:43:42 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1744587186' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Dec 03 21:43:42 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2889469172' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:43:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 03 21:43:43 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3284278167' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 03 21:43:43 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15066 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 59 sent 57 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:33.149898+0000 osd.2 (osd.2) 58 : cluster [DBG] 4.18 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:33.160452+0000 osd.2 (osd.2) 59 : cluster [DBG] 4.18 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 59)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:33.149898+0000 osd.2 (osd.2) 58 : cluster [DBG] 4.18 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:33.160452+0000 osd.2 (osd.2) 59 : cluster [DBG] 4.18 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62783488 unmapped: 122880 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:04.208120+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62791680 unmapped: 114688 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 425895 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:05.208231+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 61 sent 59 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:35.159020+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:35.169684+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 61)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:35.159020+0000 osd.2 (osd.2) 60 : cluster [DBG] 4.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:35.169684+0000 osd.2 (osd.2) 61 : cluster [DBG] 4.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62791680 unmapped: 114688 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:06.208389+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62791680 unmapped: 114688 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:07.208525+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62799872 unmapped: 106496 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:08.208717+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62799872 unmapped: 106496 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:09.208886+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 98304 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 428308 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:10.209040+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 63 sent 61 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:40.195782+0000 osd.2 (osd.2) 62 : cluster [DBG] 4.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:40.206247+0000 osd.2 (osd.2) 63 : cluster [DBG] 4.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 63)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:40.195782+0000 osd.2 (osd.2) 62 : cluster [DBG] 4.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:40.206247+0000 osd.2 (osd.2) 63 : cluster [DBG] 4.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62808064 unmapped: 98304 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:11.209217+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 1 last_log 64 sent 63 num 1 unsent 1 sending 1
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:41.203443+0000 osd.2 (osd.2) 64 : cluster [DBG] 3.18 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 64)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:41.203443+0000 osd.2 (osd.2) 64 : cluster [DBG] 3.18 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62816256 unmapped: 90112 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:12.209419+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 1 last_log 65 sent 64 num 1 unsent 1 sending 1
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:41.214016+0000 osd.2 (osd.2) 65 : cluster [DBG] 3.18 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.105876923s of 10.125753403s, submitted: 10
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 65)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:41.214016+0000 osd.2 (osd.2) 65 : cluster [DBG] 3.18 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 81920 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:13.209631+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:42.230851+0000 osd.2 (osd.2) 66 : cluster [DBG] 7.1c scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:42.241414+0000 osd.2 (osd.2) 67 : cluster [DBG] 7.1c scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 67)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:42.230851+0000 osd.2 (osd.2) 66 : cluster [DBG] 7.1c scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:42.241414+0000 osd.2 (osd.2) 67 : cluster [DBG] 7.1c scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62824448 unmapped: 81920 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:14.209779+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:44.186057+0000 osd.2 (osd.2) 68 : cluster [DBG] 3.16 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:44.196617+0000 osd.2 (osd.2) 69 : cluster [DBG] 3.16 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 69)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:44.186057+0000 osd.2 (osd.2) 68 : cluster [DBG] 3.16 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:44.196617+0000 osd.2 (osd.2) 69 : cluster [DBG] 3.16 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62840832 unmapped: 65536 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 437958 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:15.209921+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62840832 unmapped: 65536 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:16.210049+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62840832 unmapped: 65536 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:17.210193+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62873600 unmapped: 32768 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:18.210325+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:47.280854+0000 osd.2 (osd.2) 70 : cluster [DBG] 7.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:47.291412+0000 osd.2 (osd.2) 71 : cluster [DBG] 7.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 71)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:47.280854+0000 osd.2 (osd.2) 70 : cluster [DBG] 7.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:47.291412+0000 osd.2 (osd.2) 71 : cluster [DBG] 7.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62873600 unmapped: 32768 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:19.210503+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62881792 unmapped: 24576 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 440371 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:20.210666+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62889984 unmapped: 16384 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:21.210813+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62898176 unmapped: 8192 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:22.210961+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62906368 unmapped: 0 heap: 62906368 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:23.211123+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.043985367s of 11.063570023s, submitted: 6
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62914560 unmapped: 1040384 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:24.211262+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:53.294122+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:53.304844+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 73)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:53.294122+0000 osd.2 (osd.2) 72 : cluster [DBG] 4.1 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:53.304844+0000 osd.2 (osd.2) 73 : cluster [DBG] 4.1 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62922752 unmapped: 1032192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 442782 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:25.211476+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62922752 unmapped: 1032192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:26.211664+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:55.319112+0000 osd.2 (osd.2) 74 : cluster [DBG] 7.a scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:55.329812+0000 osd.2 (osd.2) 75 : cluster [DBG] 7.a scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 75)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:55.319112+0000 osd.2 (osd.2) 74 : cluster [DBG] 7.a scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:55.329812+0000 osd.2 (osd.2) 75 : cluster [DBG] 7.a scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62922752 unmapped: 1032192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:27.211892+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:56.324761+0000 osd.2 (osd.2) 76 : cluster [DBG] 3.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:56.335383+0000 osd.2 (osd.2) 77 : cluster [DBG] 3.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 77)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:56.324761+0000 osd.2 (osd.2) 76 : cluster [DBG] 3.11 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:56.335383+0000 osd.2 (osd.2) 77 : cluster [DBG] 3.11 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62930944 unmapped: 1024000 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:28.212190+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:57.328208+0000 osd.2 (osd.2) 78 : cluster [DBG] 7.15 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:57.338655+0000 osd.2 (osd.2) 79 : cluster [DBG] 7.15 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 79)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:57.328208+0000 osd.2 (osd.2) 78 : cluster [DBG] 7.15 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:57.338655+0000 osd.2 (osd.2) 79 : cluster [DBG] 7.15 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62939136 unmapped: 1015808 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:29.212495+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62947328 unmapped: 1007616 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 452430 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:30.212647+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:59.343361+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:11:59.353960+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 81)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:59.343361+0000 osd.2 (osd.2) 80 : cluster [DBG] 3.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:11:59.353960+0000 osd.2 (osd.2) 81 : cluster [DBG] 3.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62955520 unmapped: 999424 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:31.212874+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62963712 unmapped: 991232 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:32.213040+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62971904 unmapped: 983040 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:33.213255+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:02.380735+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.8 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:02.391278+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.8 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 83)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:02.380735+0000 osd.2 (osd.2) 82 : cluster [DBG] 7.8 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:02.391278+0000 osd.2 (osd.2) 83 : cluster [DBG] 7.8 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 62980096 unmapped: 974848 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:34.213509+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.994940758s of 11.102183342s, submitted: 12
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63012864 unmapped: 942080 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 457252 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:35.213680+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:04.396685+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.5 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:04.407272+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.5 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 85)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:04.396685+0000 osd.2 (osd.2) 84 : cluster [DBG] 7.5 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:04.407272+0000 osd.2 (osd.2) 85 : cluster [DBG] 7.5 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63012864 unmapped: 942080 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:36.213871+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:05.348833+0000 osd.2 (osd.2) 86 : cluster [DBG] 7.2 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:05.359416+0000 osd.2 (osd.2) 87 : cluster [DBG] 7.2 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 87)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:05.348833+0000 osd.2 (osd.2) 86 : cluster [DBG] 7.2 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:05.359416+0000 osd.2 (osd.2) 87 : cluster [DBG] 7.2 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63021056 unmapped: 933888 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:37.214074+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63021056 unmapped: 933888 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:38.214245+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:07.350511+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.1 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:07.361110+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.1 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 89)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:07.350511+0000 osd.2 (osd.2) 88 : cluster [DBG] 7.1 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:07.361110+0000 osd.2 (osd.2) 89 : cluster [DBG] 7.1 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63029248 unmapped: 925696 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:39.214504+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63029248 unmapped: 925696 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 462074 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:40.214688+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63045632 unmapped: 909312 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:41.214935+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63053824 unmapped: 901120 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:42.215108+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63053824 unmapped: 901120 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:43.215336+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63062016 unmapped: 892928 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:44.215547+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:13.415133+0000 osd.2 (osd.2) 90 : cluster [DBG] 3.7 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:13.425765+0000 osd.2 (osd.2) 91 : cluster [DBG] 3.7 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63070208 unmapped: 884736 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 464485 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 91)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:13.415133+0000 osd.2 (osd.2) 90 : cluster [DBG] 3.7 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:13.425765+0000 osd.2 (osd.2) 91 : cluster [DBG] 3.7 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:45.215805+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63070208 unmapped: 884736 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:46.215980+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.c scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.941810608s of 11.959441185s, submitted: 8
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.c scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63078400 unmapped: 876544 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:47.216162+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:16.356251+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.c scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:16.366904+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.c scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63094784 unmapped: 860160 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 93)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:16.356251+0000 osd.2 (osd.2) 92 : cluster [DBG] 7.c scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:16.366904+0000 osd.2 (osd.2) 93 : cluster [DBG] 7.c scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:48.216325+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:17.373947+0000 osd.2 (osd.2) 94 : cluster [DBG] 3.1d scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:17.384530+0000 osd.2 (osd.2) 95 : cluster [DBG] 3.1d scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63111168 unmapped: 843776 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 95)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:17.373947+0000 osd.2 (osd.2) 94 : cluster [DBG] 3.1d scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:17.384530+0000 osd.2 (osd.2) 95 : cluster [DBG] 3.1d scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:49.216489+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:18.407332+0000 osd.2 (osd.2) 96 : cluster [DBG] 7.1a scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:18.417911+0000 osd.2 (osd.2) 97 : cluster [DBG] 7.1a scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63119360 unmapped: 835584 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 471722 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 97)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:18.407332+0000 osd.2 (osd.2) 96 : cluster [DBG] 7.1a scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:18.417911+0000 osd.2 (osd.2) 97 : cluster [DBG] 7.1a scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:50.216705+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63119360 unmapped: 835584 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:51.216828+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63127552 unmapped: 827392 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:52.216963+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63135744 unmapped: 819200 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:53.217105+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63143936 unmapped: 811008 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:54.217289+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:23.418645+0000 osd.2 (osd.2) 98 : cluster [DBG] 4.13 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:23.429196+0000 osd.2 (osd.2) 99 : cluster [DBG] 4.13 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 99)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:23.418645+0000 osd.2 (osd.2) 98 : cluster [DBG] 4.13 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:23.429196+0000 osd.2 (osd.2) 99 : cluster [DBG] 4.13 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63160320 unmapped: 794624 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 474135 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:55.217486+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63160320 unmapped: 794624 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:56.217785+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:25.473604+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.5 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:25.484176+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.5 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 101)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:25.473604+0000 osd.2 (osd.2) 100 : cluster [DBG] 3.5 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:25.484176+0000 osd.2 (osd.2) 101 : cluster [DBG] 3.5 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.860826492s of 10.089083672s, submitted: 10
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63168512 unmapped: 786432 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:57.218035+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:26.445208+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:26.455854+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 103)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:26.445208+0000 osd.2 (osd.2) 102 : cluster [DBG] 7.e scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:26.455854+0000 osd.2 (osd.2) 103 : cluster [DBG] 7.e scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63168512 unmapped: 786432 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:58.218296+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63176704 unmapped: 778240 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:59.218475+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63176704 unmapped: 778240 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 481368 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:00.218638+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:29.434275+0000 osd.2 (osd.2) 104 : cluster [DBG] 6.8 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:29.444869+0000 osd.2 (osd.2) 105 : cluster [DBG] 6.8 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 105)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:29.434275+0000 osd.2 (osd.2) 104 : cluster [DBG] 6.8 scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:29.444869+0000 osd.2 (osd.2) 105 : cluster [DBG] 6.8 scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63176704 unmapped: 778240 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:01.218854+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.f scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_channel(cluster) log [DBG] : 6.f scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63184896 unmapped: 770048 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:02.219005+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:31.509926+0000 osd.2 (osd.2) 106 : cluster [DBG] 6.f scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  will send 2025-12-03T21:12:31.530908+0000 osd.2 (osd.2) 107 : cluster [DBG] 6.f scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client handle_log_ack log(last 107)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:31.509926+0000 osd.2 (osd.2) 106 : cluster [DBG] 6.f scrub starts
Dec 03 21:43:43 compute-0 ceph-osd[88129]: log_client  logged 2025-12-03T21:12:31.530908+0000 osd.2 (osd.2) 107 : cluster [DBG] 6.f scrub ok
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63201280 unmapped: 753664 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:03.219205+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63209472 unmapped: 745472 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:04.219345+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63209472 unmapped: 745472 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:05.219501+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63209472 unmapped: 745472 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:06.219655+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63217664 unmapped: 737280 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:07.219993+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63217664 unmapped: 737280 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:08.220149+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63225856 unmapped: 729088 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:09.220310+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63234048 unmapped: 720896 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:10.220469+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63242240 unmapped: 712704 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:11.220672+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63242240 unmapped: 712704 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:12.220833+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63242240 unmapped: 712704 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:13.221011+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63250432 unmapped: 704512 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:14.221193+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63250432 unmapped: 704512 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:15.221683+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63275008 unmapped: 679936 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:16.221854+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63283200 unmapped: 671744 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:17.222052+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 663552 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:18.222168+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 663552 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:19.222371+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63291392 unmapped: 663552 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:20.222707+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63299584 unmapped: 655360 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:21.222867+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63307776 unmapped: 647168 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:22.222985+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63307776 unmapped: 647168 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:23.223212+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 638976 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:24.223417+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 638976 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:25.223654+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63315968 unmapped: 638976 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:26.223929+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63324160 unmapped: 630784 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:27.224146+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63324160 unmapped: 630784 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:28.224424+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63332352 unmapped: 622592 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:29.224669+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63332352 unmapped: 622592 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:30.224846+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63332352 unmapped: 622592 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:31.224976+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63340544 unmapped: 614400 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:32.225176+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63348736 unmapped: 606208 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:33.225374+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63356928 unmapped: 598016 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:34.225613+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63356928 unmapped: 598016 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:35.225796+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63373312 unmapped: 581632 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:36.225990+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63381504 unmapped: 573440 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:37.226144+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63389696 unmapped: 565248 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:38.226328+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63389696 unmapped: 565248 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:39.226511+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63397888 unmapped: 557056 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:40.226687+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63406080 unmapped: 548864 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:41.226875+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63406080 unmapped: 548864 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:42.227028+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63422464 unmapped: 532480 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:43.227298+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63422464 unmapped: 532480 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:44.227541+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63430656 unmapped: 524288 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:45.227754+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63430656 unmapped: 524288 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:46.227947+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:47.228103+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:48.228255+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63438848 unmapped: 516096 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:49.228418+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63447040 unmapped: 507904 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:50.228563+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63447040 unmapped: 507904 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:51.228794+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63463424 unmapped: 491520 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:52.228961+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63463424 unmapped: 491520 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:53.229141+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63463424 unmapped: 491520 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:54.229291+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63471616 unmapped: 483328 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:55.229478+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63471616 unmapped: 483328 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:56.229634+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63479808 unmapped: 475136 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:57.229816+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63488000 unmapped: 466944 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:58.230014+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63488000 unmapped: 466944 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:59.230164+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63496192 unmapped: 458752 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:00.230382+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63496192 unmapped: 458752 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:01.230602+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63496192 unmapped: 458752 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:02.230777+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63504384 unmapped: 450560 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:03.230978+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63504384 unmapped: 450560 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:04.231132+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63512576 unmapped: 442368 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:05.231247+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63520768 unmapped: 434176 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:06.231377+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63520768 unmapped: 434176 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:07.231521+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:08.231692+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 425984 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:09.231863+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 425984 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:10.231998+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63528960 unmapped: 425984 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:11.232166+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63545344 unmapped: 409600 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:12.232293+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63545344 unmapped: 409600 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:13.232469+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63553536 unmapped: 401408 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:14.232589+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63553536 unmapped: 401408 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:15.232712+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63561728 unmapped: 393216 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:16.232922+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63578112 unmapped: 376832 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:17.233139+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63578112 unmapped: 376832 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:18.233297+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63586304 unmapped: 368640 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:19.233542+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63586304 unmapped: 368640 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:20.233839+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63594496 unmapped: 360448 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:21.234031+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63594496 unmapped: 360448 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:22.234161+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63594496 unmapped: 360448 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:23.234305+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 352256 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:24.234495+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 352256 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:25.234705+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63602688 unmapped: 352256 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:26.234945+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63627264 unmapped: 327680 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:27.235153+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63627264 unmapped: 327680 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:28.235383+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63635456 unmapped: 319488 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:29.235674+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63635456 unmapped: 319488 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:30.235850+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63635456 unmapped: 319488 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:31.236095+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63651840 unmapped: 303104 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:32.236353+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63651840 unmapped: 303104 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:33.236630+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63651840 unmapped: 303104 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:34.236917+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63660032 unmapped: 294912 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:35.237081+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63660032 unmapped: 294912 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:36.237268+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63684608 unmapped: 270336 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:37.237434+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63684608 unmapped: 270336 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:38.237654+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63692800 unmapped: 262144 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:39.237834+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63692800 unmapped: 262144 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:40.237980+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63692800 unmapped: 262144 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:41.238135+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63700992 unmapped: 253952 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:42.238272+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63700992 unmapped: 253952 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:43.238466+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63709184 unmapped: 245760 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:44.238680+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63725568 unmapped: 229376 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:45.238820+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63725568 unmapped: 229376 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:46.238976+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:47.239109+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:48.239254+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:49.239515+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:50.239728+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:51.239891+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63733760 unmapped: 221184 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:52.240102+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:53.240327+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63741952 unmapped: 212992 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:54.240508+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63750144 unmapped: 204800 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:55.240675+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63750144 unmapped: 204800 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:56.240836+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63766528 unmapped: 188416 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:57.240964+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63774720 unmapped: 180224 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:58.241119+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63791104 unmapped: 163840 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:59.241262+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63799296 unmapped: 155648 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:00.241395+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63799296 unmapped: 155648 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:01.241523+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63807488 unmapped: 147456 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:02.241676+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63807488 unmapped: 147456 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:03.241840+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63807488 unmapped: 147456 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:04.241974+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63815680 unmapped: 139264 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:05.242090+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63815680 unmapped: 139264 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:06.242263+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63823872 unmapped: 131072 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:07.242432+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63832064 unmapped: 122880 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:08.242600+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63832064 unmapped: 122880 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:09.242797+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63840256 unmapped: 114688 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:10.242944+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63840256 unmapped: 114688 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:11.243138+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63848448 unmapped: 106496 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:12.243268+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63848448 unmapped: 106496 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:13.243691+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63848448 unmapped: 106496 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:14.243902+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63856640 unmapped: 98304 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:15.244047+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63856640 unmapped: 98304 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:16.244272+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63856640 unmapped: 98304 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:17.244477+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63864832 unmapped: 90112 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:18.244650+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63864832 unmapped: 90112 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:19.244783+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63873024 unmapped: 81920 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:20.244930+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63873024 unmapped: 81920 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:21.245081+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63873024 unmapped: 81920 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:22.245228+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63881216 unmapped: 73728 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:23.245491+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63881216 unmapped: 73728 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:24.245702+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63881216 unmapped: 73728 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:25.245890+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63889408 unmapped: 65536 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:26.246133+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63889408 unmapped: 65536 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:27.246408+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63897600 unmapped: 57344 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:28.246623+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63897600 unmapped: 57344 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:29.246969+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63897600 unmapped: 57344 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:30.247151+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63905792 unmapped: 49152 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:31.247377+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63913984 unmapped: 40960 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:32.247684+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63922176 unmapped: 32768 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:33.247911+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63922176 unmapped: 32768 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:34.248176+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63930368 unmapped: 24576 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:35.248424+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63930368 unmapped: 24576 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:36.248693+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63930368 unmapped: 24576 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:37.248884+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63938560 unmapped: 16384 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:38.249145+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63938560 unmapped: 16384 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:39.249427+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63946752 unmapped: 8192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:40.249688+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63946752 unmapped: 8192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:41.249995+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63946752 unmapped: 8192 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:42.250297+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63954944 unmapped: 0 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:43.250631+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63954944 unmapped: 0 heap: 63954944 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:44.250800+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63963136 unmapped: 1040384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:45.251004+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63963136 unmapped: 1040384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:46.251179+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63971328 unmapped: 1032192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:47.251311+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63971328 unmapped: 1032192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:48.251462+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63971328 unmapped: 1032192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:49.251696+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63979520 unmapped: 1024000 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:50.251947+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63979520 unmapped: 1024000 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:51.252205+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63979520 unmapped: 1024000 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:52.252499+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63995904 unmapped: 1007616 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:53.252779+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 63995904 unmapped: 1007616 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:54.252984+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64004096 unmapped: 999424 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:55.253201+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64004096 unmapped: 999424 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:56.253413+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64004096 unmapped: 999424 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:57.253636+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64012288 unmapped: 991232 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:58.253796+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64012288 unmapped: 991232 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:59.254003+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64020480 unmapped: 983040 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:00.254185+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64020480 unmapped: 983040 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:01.254369+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64020480 unmapped: 983040 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:02.254513+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64028672 unmapped: 974848 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:03.254670+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64028672 unmapped: 974848 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:04.254872+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64036864 unmapped: 966656 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:05.255034+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64036864 unmapped: 966656 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:06.255194+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64036864 unmapped: 966656 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:07.255358+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64045056 unmapped: 958464 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:08.255547+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64045056 unmapped: 958464 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:09.255697+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64045056 unmapped: 958464 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:10.255847+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64053248 unmapped: 950272 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:11.255998+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64053248 unmapped: 950272 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:12.256145+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64061440 unmapped: 942080 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:13.256315+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64061440 unmapped: 942080 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:14.256493+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:15.256647+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:16.256835+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64061440 unmapped: 942080 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:17.256982+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:18.257119+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:19.257235+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64069632 unmapped: 933888 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:20.257389+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64077824 unmapped: 925696 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:21.257548+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64077824 unmapped: 925696 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:22.257692+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64086016 unmapped: 917504 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:23.257910+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64086016 unmapped: 917504 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:24.258098+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64094208 unmapped: 909312 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:25.258221+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64094208 unmapped: 909312 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:26.258364+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64102400 unmapped: 901120 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:27.258498+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64102400 unmapped: 901120 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:28.258633+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64102400 unmapped: 901120 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:29.258794+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64110592 unmapped: 892928 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:30.258944+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64110592 unmapped: 892928 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:31.259126+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64110592 unmapped: 892928 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:32.259316+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64118784 unmapped: 884736 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:33.259494+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64118784 unmapped: 884736 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:34.259657+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64126976 unmapped: 876544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:35.259847+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64126976 unmapped: 876544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:36.259996+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64126976 unmapped: 876544 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:37.260189+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64135168 unmapped: 868352 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:38.260369+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64135168 unmapped: 868352 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:39.260516+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64143360 unmapped: 860160 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:40.260669+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64143360 unmapped: 860160 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:41.260844+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64143360 unmapped: 860160 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:42.260986+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64151552 unmapped: 851968 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:43.261161+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64151552 unmapped: 851968 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:44.261302+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64159744 unmapped: 843776 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:45.261522+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64159744 unmapped: 843776 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:46.285707+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64159744 unmapped: 843776 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:47.285856+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64167936 unmapped: 835584 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:48.285961+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64167936 unmapped: 835584 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:49.286122+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64167936 unmapped: 835584 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:50.286264+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64176128 unmapped: 827392 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:51.286373+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64176128 unmapped: 827392 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:52.286482+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64184320 unmapped: 819200 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:53.286664+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64184320 unmapped: 819200 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:54.286814+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64184320 unmapped: 819200 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:55.286939+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64192512 unmapped: 811008 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:56.287073+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64192512 unmapped: 811008 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:57.287214+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64192512 unmapped: 811008 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:58.287330+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64200704 unmapped: 802816 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:59.287484+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64200704 unmapped: 802816 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:00.287744+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 794624 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:01.287995+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64208896 unmapped: 794624 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:02.288137+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64217088 unmapped: 786432 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:03.288410+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64217088 unmapped: 786432 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:04.288563+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64217088 unmapped: 786432 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:05.288739+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64225280 unmapped: 778240 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:06.288923+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64225280 unmapped: 778240 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:07.289100+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64233472 unmapped: 770048 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:08.289264+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64233472 unmapped: 770048 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:09.289390+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 761856 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:10.289584+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 761856 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:11.289734+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64241664 unmapped: 761856 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:12.289889+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64249856 unmapped: 753664 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:13.290935+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64249856 unmapped: 753664 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:14.291094+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64249856 unmapped: 753664 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:15.291205+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64258048 unmapped: 745472 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:16.291356+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64258048 unmapped: 745472 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:17.291498+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64266240 unmapped: 737280 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:18.291656+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64266240 unmapped: 737280 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:19.291791+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64266240 unmapped: 737280 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:20.291925+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64274432 unmapped: 729088 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:21.292086+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64274432 unmapped: 729088 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:22.292317+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64274432 unmapped: 729088 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:23.292493+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64282624 unmapped: 720896 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:24.292636+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64282624 unmapped: 720896 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:25.292813+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64290816 unmapped: 712704 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:26.292960+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64290816 unmapped: 712704 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:27.293134+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64290816 unmapped: 712704 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:28.293271+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64299008 unmapped: 704512 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:29.293460+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64299008 unmapped: 704512 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:30.293607+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64299008 unmapped: 704512 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:31.293738+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64307200 unmapped: 696320 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:32.293856+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64307200 unmapped: 696320 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:33.293996+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64315392 unmapped: 688128 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:34.294150+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64315392 unmapped: 688128 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:35.294330+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64323584 unmapped: 679936 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:36.294498+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64323584 unmapped: 679936 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:37.294677+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64323584 unmapped: 679936 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:38.294815+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64331776 unmapped: 671744 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:39.294948+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64331776 unmapped: 671744 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:40.295083+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64339968 unmapped: 663552 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:41.295203+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64339968 unmapped: 663552 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:42.295344+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64339968 unmapped: 663552 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:43.295565+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64348160 unmapped: 655360 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:44.295800+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64348160 unmapped: 655360 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:45.295951+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64348160 unmapped: 655360 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:46.296159+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64356352 unmapped: 647168 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:47.296316+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64356352 unmapped: 647168 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:48.296512+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64364544 unmapped: 638976 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:49.296732+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64364544 unmapped: 638976 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:50.296864+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64364544 unmapped: 638976 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:51.296977+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64372736 unmapped: 630784 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:52.297084+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64372736 unmapped: 630784 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:53.297221+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64372736 unmapped: 630784 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:54.297447+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64380928 unmapped: 622592 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:55.297659+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64380928 unmapped: 622592 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:56.297811+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64389120 unmapped: 614400 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:57.298001+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64389120 unmapped: 614400 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:58.298115+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64397312 unmapped: 606208 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:59.298321+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64397312 unmapped: 606208 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:00.298479+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64397312 unmapped: 606208 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:01.298698+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64405504 unmapped: 598016 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:02.298873+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64405504 unmapped: 598016 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:03.299077+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64405504 unmapped: 598016 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:04.299321+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:05.299491+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:06.299666+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64413696 unmapped: 589824 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:07.299876+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 581632 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:08.300044+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64421888 unmapped: 581632 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:09.300254+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64430080 unmapped: 573440 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:10.300459+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64430080 unmapped: 573440 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:11.300665+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64430080 unmapped: 573440 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:12.300830+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64438272 unmapped: 565248 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:13.301021+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64438272 unmapped: 565248 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:14.301189+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64446464 unmapped: 557056 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:15.301373+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64446464 unmapped: 557056 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:16.301507+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64454656 unmapped: 548864 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:17.301705+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64454656 unmapped: 548864 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:18.301851+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64454656 unmapped: 548864 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:19.301972+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64462848 unmapped: 540672 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:20.302088+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64471040 unmapped: 532480 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:21.302202+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64471040 unmapped: 532480 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:22.302343+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64479232 unmapped: 524288 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:23.302884+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64479232 unmapped: 524288 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:24.303018+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64479232 unmapped: 524288 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:25.303190+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64487424 unmapped: 516096 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:26.303332+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64487424 unmapped: 516096 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:27.303464+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64495616 unmapped: 507904 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:28.303603+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64495616 unmapped: 507904 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:29.303729+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64503808 unmapped: 499712 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:30.303864+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64503808 unmapped: 499712 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:31.304072+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64512000 unmapped: 491520 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:32.304225+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64512000 unmapped: 491520 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:33.304428+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64512000 unmapped: 491520 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:34.304638+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64520192 unmapped: 483328 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:35.304781+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64520192 unmapped: 483328 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:36.304946+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64528384 unmapped: 475136 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:37.305129+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64528384 unmapped: 475136 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:38.305252+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64528384 unmapped: 475136 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:39.305382+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64536576 unmapped: 466944 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:40.305550+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64536576 unmapped: 466944 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:41.305781+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64536576 unmapped: 466944 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:42.305958+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64544768 unmapped: 458752 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:43.306159+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64544768 unmapped: 458752 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:44.306316+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64552960 unmapped: 450560 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:45.306521+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64552960 unmapped: 450560 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:46.306642+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64552960 unmapped: 450560 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:47.306799+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64561152 unmapped: 442368 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:48.306949+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64561152 unmapped: 442368 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:49.307152+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64569344 unmapped: 434176 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:50.307302+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64569344 unmapped: 434176 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:51.307514+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64569344 unmapped: 434176 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:52.307655+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64577536 unmapped: 425984 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:53.307832+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64577536 unmapped: 425984 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:54.307999+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64585728 unmapped: 417792 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:55.308190+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64585728 unmapped: 417792 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:56.308397+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64585728 unmapped: 417792 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:57.308714+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64593920 unmapped: 409600 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:58.308881+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64593920 unmapped: 409600 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:59.309097+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 401408 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:00.309220+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:01.309334+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 401408 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:02.309607+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64602112 unmapped: 401408 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:03.309796+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64610304 unmapped: 393216 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:04.309928+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64610304 unmapped: 393216 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:05.310048+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64618496 unmapped: 385024 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:06.310160+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64618496 unmapped: 385024 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:07.310441+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64618496 unmapped: 385024 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:08.310649+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64626688 unmapped: 376832 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:09.310819+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64626688 unmapped: 376832 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:10.310969+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64634880 unmapped: 368640 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:11.311133+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64634880 unmapped: 368640 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:12.311285+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64634880 unmapped: 368640 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:13.311491+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64643072 unmapped: 360448 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:14.311685+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64643072 unmapped: 360448 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:15.311841+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64643072 unmapped: 360448 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:16.312011+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64651264 unmapped: 352256 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:17.312120+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64651264 unmapped: 352256 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:18.312281+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64659456 unmapped: 344064 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:19.312438+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64659456 unmapped: 344064 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:20.312513+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64659456 unmapped: 344064 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:21.312661+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64667648 unmapped: 335872 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:22.312783+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64667648 unmapped: 335872 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:23.312928+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64675840 unmapped: 327680 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:24.313170+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64675840 unmapped: 327680 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:25.313363+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64675840 unmapped: 327680 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:26.313502+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64684032 unmapped: 319488 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:27.313685+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64684032 unmapped: 319488 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:28.313809+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64684032 unmapped: 319488 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:29.313934+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:30.314309+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:31.314469+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:32.314652+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64692224 unmapped: 311296 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:33.314871+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64700416 unmapped: 303104 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:34.315113+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64700416 unmapped: 303104 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:35.315239+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64700416 unmapped: 303104 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:36.315383+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64708608 unmapped: 294912 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:37.315554+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64708608 unmapped: 294912 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:38.315737+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 286720 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:39.315899+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 286720 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:40.316093+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64716800 unmapped: 286720 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:41.316272+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64724992 unmapped: 278528 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:42.316436+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64724992 unmapped: 278528 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:43.316674+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64724992 unmapped: 278528 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:44.316899+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64733184 unmapped: 270336 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:45.317059+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64733184 unmapped: 270336 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:46.317193+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64733184 unmapped: 270336 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:47.317388+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64741376 unmapped: 262144 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:48.317541+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64741376 unmapped: 262144 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 16.19 MB, 0.03 MB/s
                                           Interval WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:49.317647+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64806912 unmapped: 196608 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:50.317780+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64806912 unmapped: 196608 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:51.318036+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64815104 unmapped: 188416 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:52.318313+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64815104 unmapped: 188416 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:53.318762+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64815104 unmapped: 188416 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:54.318927+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64823296 unmapped: 180224 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:55.319419+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64823296 unmapped: 180224 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:56.319629+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64831488 unmapped: 172032 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:57.320022+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64831488 unmapped: 172032 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:58.320160+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64839680 unmapped: 163840 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:59.320361+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64839680 unmapped: 163840 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:00.320631+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64839680 unmapped: 163840 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:01.320885+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64847872 unmapped: 155648 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:02.321111+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64847872 unmapped: 155648 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:03.321336+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64847872 unmapped: 155648 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:04.321743+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64856064 unmapped: 147456 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:05.322130+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64856064 unmapped: 147456 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:06.322616+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64864256 unmapped: 139264 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:07.322942+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64864256 unmapped: 139264 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:08.323180+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64864256 unmapped: 139264 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:09.323454+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64872448 unmapped: 131072 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:10.323873+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64872448 unmapped: 131072 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:11.324231+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64880640 unmapped: 122880 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:12.324538+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64880640 unmapped: 122880 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:13.324929+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64880640 unmapped: 122880 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:14.325237+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64888832 unmapped: 114688 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:15.325679+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64888832 unmapped: 114688 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:16.325900+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64897024 unmapped: 106496 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:17.326133+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64897024 unmapped: 106496 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:18.326383+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64897024 unmapped: 106496 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:19.326749+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64905216 unmapped: 98304 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:20.327026+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64905216 unmapped: 98304 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:21.327327+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64921600 unmapped: 81920 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:22.327736+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64921600 unmapped: 81920 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:23.328077+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64921600 unmapped: 81920 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:24.328340+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64929792 unmapped: 73728 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:25.328707+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64929792 unmapped: 73728 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:26.328931+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64937984 unmapped: 65536 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:27.329189+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64937984 unmapped: 65536 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:28.329428+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64937984 unmapped: 65536 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:29.329696+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64946176 unmapped: 57344 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:30.330020+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64946176 unmapped: 57344 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:31.330305+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64946176 unmapped: 57344 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:32.330641+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64954368 unmapped: 49152 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:33.330922+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64954368 unmapped: 49152 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:34.331191+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64962560 unmapped: 40960 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:35.331372+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64962560 unmapped: 40960 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:36.331519+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64970752 unmapped: 32768 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:37.331723+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64970752 unmapped: 32768 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:38.331976+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64970752 unmapped: 32768 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:39.332255+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64978944 unmapped: 24576 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:40.332669+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64978944 unmapped: 24576 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:41.332946+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64987136 unmapped: 16384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:42.333236+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64987136 unmapped: 16384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:43.334002+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64987136 unmapped: 16384 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:44.334245+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64995328 unmapped: 8192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:45.334515+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 64995328 unmapped: 8192 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:46.334950+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65003520 unmapped: 0 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:47.335199+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65003520 unmapped: 0 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:48.335400+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65003520 unmapped: 0 heap: 65003520 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:49.335725+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65011712 unmapped: 1040384 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:50.335975+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65011712 unmapped: 1040384 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:51.336267+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65019904 unmapped: 1032192 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:52.336590+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65028096 unmapped: 1024000 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:53.337000+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65028096 unmapped: 1024000 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:54.338188+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:55.339313+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:56.339618+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65036288 unmapped: 1015808 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:57.339810+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65044480 unmapped: 1007616 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:58.339977+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65044480 unmapped: 1007616 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:59.340141+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65044480 unmapped: 1007616 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:00.340289+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65052672 unmapped: 999424 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:01.340438+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:02.340560+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:03.340844+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65060864 unmapped: 991232 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:04.341014+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65069056 unmapped: 983040 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:05.341159+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65069056 unmapped: 983040 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:06.341316+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65069056 unmapped: 983040 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:07.341523+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65077248 unmapped: 974848 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:08.341641+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65085440 unmapped: 966656 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:09.341782+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65085440 unmapped: 966656 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:10.341964+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65085440 unmapped: 966656 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:11.342132+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65093632 unmapped: 958464 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:12.342397+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65093632 unmapped: 958464 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:13.342637+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:14.342808+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:15.342993+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65101824 unmapped: 950272 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:16.343153+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65110016 unmapped: 942080 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:17.343311+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65110016 unmapped: 942080 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:18.343446+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65110016 unmapped: 942080 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:19.343590+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:20.343767+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:21.343932+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:22.344079+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:23.344251+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65118208 unmapped: 933888 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:24.344411+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65126400 unmapped: 925696 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:25.344668+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65126400 unmapped: 925696 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:26.345011+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 917504 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:27.345149+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 917504 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:28.345358+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65134592 unmapped: 917504 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:29.345543+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65142784 unmapped: 909312 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:30.345730+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65142784 unmapped: 909312 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:31.345889+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:32.346047+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:33.346207+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65150976 unmapped: 901120 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:34.346354+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65159168 unmapped: 892928 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:35.346493+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65159168 unmapped: 892928 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:36.346709+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:37.346936+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:38.347061+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:39.377242+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:40.377378+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:41.377535+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:42.377946+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65167360 unmapped: 884736 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:43.378662+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:44.378834+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:45.378986+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65175552 unmapped: 876544 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:46.379711+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:47.379871+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:48.380059+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65183744 unmapped: 868352 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:49.380191+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:50.380357+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:51.380527+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:52.380684+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:53.381637+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:54.382020+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:55.382769+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:56.382996+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:57.383314+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:58.383533+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:59.383675+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:00.383854+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:01.383992+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:02.384146+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:03.384417+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:04.384601+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:05.384734+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:06.384892+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:07.385084+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:08.385252+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:09.385425+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65191936 unmapped: 860160 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:10.385733+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:11.386048+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:12.386244+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:13.386543+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:14.386641+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:15.386902+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:16.387044+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:17.387212+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:18.387417+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:19.387634+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65200128 unmapped: 851968 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:20.387767+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:21.387923+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:22.388113+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:23.388323+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:24.388453+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:25.388626+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:26.388841+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:27.389109+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:28.389371+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:29.389681+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:30.389972+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:31.390212+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:32.390430+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:33.390749+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:34.391000+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:35.391229+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:36.391487+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:37.391646+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:38.391801+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:39.392019+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:40.392260+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:41.392471+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:42.392696+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:43.392946+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:44.393130+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:45.393318+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:46.393494+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:47.393627+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:48.393990+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:49.394157+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:50.394362+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65208320 unmapped: 843776 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:51.394722+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:52.394898+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:53.395091+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:54.395242+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:55.395430+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:56.395644+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:57.395805+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:58.396010+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:59.396226+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:00.396476+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:01.396782+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:02.396950+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:03.397130+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:04.397336+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:05.397506+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:06.397655+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:07.397805+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:08.397935+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:09.398048+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:10.398163+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:11.398382+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:12.398542+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:13.398779+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:14.398900+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:15.399033+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:16.399187+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:17.399322+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:18.399477+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:19.399622+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:20.399761+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:21.399908+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:22.400027+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:23.400196+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65216512 unmapped: 835584 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:24.400362+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:25.400514+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:26.400701+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:27.400803+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:28.400940+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:29.401140+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:30.401384+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:31.401680+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:32.402027+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:33.402201+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:34.402351+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:35.402491+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:36.402643+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:37.402826+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:38.402937+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:39.403172+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:40.403425+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65224704 unmapped: 827392 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:41.403642+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:42.403791+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:43.403996+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:44.404178+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:45.404767+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:46.405253+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:47.405448+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:48.405631+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:49.405884+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:50.406039+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:51.406225+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65232896 unmapped: 819200 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:52.406376+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:53.406634+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:54.406875+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:55.407032+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:56.407222+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:57.407383+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:58.407519+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:59.407683+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:00.407836+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:01.407972+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:02.408134+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:03.408431+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:04.408620+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:05.408758+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:06.408899+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:07.409025+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:08.409273+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:09.409499+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:10.409641+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:11.409782+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:12.409942+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65241088 unmapped: 811008 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:13.410319+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:14.410483+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:15.410651+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:16.410832+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:17.410990+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:18.411128+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:19.411291+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:20.411454+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:21.411658+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:22.411900+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:23.412156+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:24.412358+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:25.412536+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:26.412695+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:27.412851+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:28.413021+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:29.413179+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:30.413332+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:31.413503+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:32.413688+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:33.413896+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:34.414102+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:35.414306+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:36.414562+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:37.414818+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:38.414976+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:39.415192+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:40.415349+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:41.415536+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:42.415644+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:43.415822+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:44.416273+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:45.416484+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:46.416642+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:47.416754+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:48.417010+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:49.417158+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:50.417318+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:51.417460+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65249280 unmapped: 802816 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:52.417600+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:53.418475+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:54.418666+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:55.418862+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:56.419009+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:57.419127+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:58.419285+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:59.419412+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:00.419606+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:01.419753+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:02.419905+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:03.420075+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:04.420202+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:05.420374+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:06.420530+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:07.420794+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:08.421056+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:09.421261+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:10.421456+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:11.421636+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:12.421823+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:13.422003+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:14.422200+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:15.422398+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:16.422562+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:17.422725+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:18.422981+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:19.423210+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:20.432197+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:21.432446+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:22.432653+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:23.432869+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:24.433053+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:25.433188+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:26.433520+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:27.433781+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:28.434017+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:29.434183+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:30.434323+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:31.434535+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:32.434759+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:33.434956+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:34.435136+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:35.435443+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:36.435643+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:37.435952+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:38.436079+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:39.436236+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:40.436448+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:41.436690+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:42.436830+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:43.437123+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:44.437244+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:45.437399+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:46.437643+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:47.437837+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:48.438009+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:49.438225+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:50.438452+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:51.438669+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:52.438800+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:53.438983+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:54.439141+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:55.439362+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:56.439495+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:57.439616+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:58.439776+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:59.439894+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:00.440069+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:01.440336+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:02.440508+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:03.440774+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:04.440981+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:05.441180+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:06.441340+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:07.441527+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:08.441704+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:09.441886+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:10.442023+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:11.442269+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:12.442461+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:13.442650+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:14.442831+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:15.442979+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:16.443185+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:17.443332+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:18.443497+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:19.443632+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:20.443780+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:21.443944+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:22.444068+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:23.444247+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:24.444402+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:25.444633+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:26.444792+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:27.444953+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:28.445150+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:29.445276+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:30.445467+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:31.445645+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:32.445805+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:33.445995+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:34.446139+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:35.446313+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:36.446545+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:37.446673+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:38.446823+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:39.446979+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:40.447145+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:41.447387+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:42.447547+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:43.447777+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:44.447965+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:45.448124+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:46.448356+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:47.448525+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:48.448670+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:49.448795+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:50.448956+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:51.449131+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:52.449315+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:53.449499+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:54.449674+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:55.449808+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:56.450019+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:57.450139+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:58.450324+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:59.450507+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:00.450757+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:01.450913+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:02.451071+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:03.451278+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:04.451424+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:05.451763+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:06.451922+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:07.452079+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:08.452243+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:09.452380+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:10.452611+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:11.452782+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:12.452957+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:13.453154+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:14.453309+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:15.453498+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:16.453646+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:17.453770+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:18.453952+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:19.454093+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:20.454239+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:21.454398+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:22.454523+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:23.454657+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:24.454780+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:25.454905+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:26.455087+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:27.455205+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:28.455305+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:29.455497+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:30.455643+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:31.455847+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:32.455966+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:33.456201+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:34.456360+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:35.456551+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:36.456741+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:37.456915+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:38.457046+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:39.457227+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:40.457390+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:41.457545+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:42.457630+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:43.457796+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:44.457936+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:45.458097+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:46.458319+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:47.458451+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:48.458663+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:49.458845+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:50.459081+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:51.459347+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:52.459482+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:53.459692+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:54.459857+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:55.460025+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 761856 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:56.460238+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:57.460449+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:58.460676+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:59.460811+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:00.461012+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:01.461262+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:02.461421+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:03.461646+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:04.461866+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:05.462089+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:06.462263+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:07.462473+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:08.462654+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:09.462823+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:10.462954+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:11.463169+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:12.463381+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:13.463603+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:14.463803+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:15.463950+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:16.464114+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:17.464356+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:18.464519+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:19.464652+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:20.464829+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:21.465012+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:22.465206+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:23.465396+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:24.465521+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:25.465719+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:26.465932+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:27.466119+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:28.466323+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:29.466541+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:30.466710+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:31.466866+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:32.467031+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:33.467254+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:34.467431+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:35.467669+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 753664 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:36.467867+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:37.468062+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:38.468216+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:39.468403+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:40.468546+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:41.468768+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:42.468925+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:43.469097+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:44.469286+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:45.469460+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:46.469661+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:47.469830+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:48.469975+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:49.470136+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:50.470349+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:51.470624+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:52.470851+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:53.471054+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:54.471229+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:55.471389+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:56.471551+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:57.471757+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:58.471910+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:59.472028+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:00.472172+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:01.472325+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:02.472504+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:03.472735+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:04.472860+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:05.473039+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:06.473229+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:07.473390+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:08.473631+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:09.473840+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:10.474002+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:11.474155+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:12.474335+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:13.474554+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:14.474769+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:15.474946+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:16.475218+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:17.475397+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:18.475671+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:19.475959+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:20.476198+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:21.476487+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 745472 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:22.476862+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:23.477230+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:24.477520+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:25.477817+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:26.478047+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:27.478304+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:28.478548+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:29.478863+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:30.479105+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:31.479276+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:32.479440+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:33.479605+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:34.479773+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:35.479921+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:36.480104+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:37.480307+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:38.480680+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:39.480804+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:40.481012+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:41.481247+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:42.481471+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:43.481648+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:44.481793+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:45.481941+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:46.482112+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:47.482276+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:48.482474+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65314816 unmapped: 737280 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4150 writes, 19K keys, 4150 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4150 writes, 366 syncs, 11.34 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.010       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b9a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.008       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x559f074b98d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.4e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:49.482676+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 704512 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:50.482862+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 704512 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:51.483003+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:52.483165+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:53.483387+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:54.483685+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:55.484379+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:56.484801+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:57.484972+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:58.485823+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:59.486466+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:00.486895+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:01.487279+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:02.487553+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:03.488249+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:04.488781+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:05.489233+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:06.489648+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:07.490064+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:08.490351+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:09.490700+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:10.490970+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:11.491238+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:12.491453+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:13.491755+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:14.491985+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:15.492276+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:16.492516+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:17.492689+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:18.492949+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:19.493137+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:20.493306+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:21.493497+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:22.493703+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:23.493910+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:24.494117+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:25.494300+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:26.494456+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:27.494618+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:28.494957+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:29.495240+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:30.495398+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:31.495601+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:32.495789+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:33.496080+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:34.496551+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:35.496935+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:36.497396+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:37.497545+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65355776 unmapped: 696320 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:38.497845+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:39.498052+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:40.498200+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:41.498383+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:42.498548+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:43.498760+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:44.498946+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65363968 unmapped: 688128 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:45.499140+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:46.499307+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:47.499500+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:48.499653+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:49.499887+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:50.500029+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:51.500339+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:52.500505+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:53.500805+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:54.500991+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:55.501159+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:56.501341+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:57.501525+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:58.501704+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:59.501895+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:00.502046+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:01.502292+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:02.502503+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:03.502697+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:04.502875+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:05.537111+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:06.537290+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:07.537509+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:08.537781+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:09.538011+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:10.538253+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:11.538482+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:12.538659+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:13.538891+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:14.539072+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:15.539355+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:16.539536+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:17.540042+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:18.540282+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47a11/0xac000, compress 0x0/0x0/0x0, omap 0x7a19, meta 0x1a285e7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 483779 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65372160 unmapped: 679936 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08fd3c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:19.540451+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 62 handle_osd_map epochs [62,63], i have 62, src has [1,63]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 1103.114379883s of 1103.126831055s, submitted: 6
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65503232 unmapped: 548864 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:20.540647+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65617920 unmapped: 17219584 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:21.540831+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 64 handle_osd_map epochs [64,65], i have 64, src has [1,65]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 65 ms_handle_reset con 0x559f08fd3c00 session 0x559f09d6b500
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 17137664 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:22.541033+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 65 heartbeat osd_stat(store_statfs(0x4fd914000/0x0/0x4ffc00000, data 0x84bbf8/0x8b6000, compress 0x0/0x0/0x0, omap 0x85f8, meta 0x1a27a08), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 17137664 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:23.541282+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09d9a800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 539076 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 17006592 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:24.541527+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 65 handle_osd_map epochs [65,66], i have 65, src has [1,66]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 66 ms_handle_reset con 0x559f09d9a800 session 0x559f0b0ea380
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:25.541691+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:26.541837+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:27.542036+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:28.542222+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 66 heartbeat osd_stat(store_statfs(0x4fd910000/0x0/0x4ffc00000, data 0x84d204/0x8ba000, compress 0x0/0x0/0x0, omap 0x8924, meta 0x1a276dc), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 543588 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:29.542536+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:30.542751+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 66 heartbeat osd_stat(store_statfs(0x4fd910000/0x0/0x4ffc00000, data 0x84d204/0x8ba000, compress 0x0/0x0/0x0, omap 0x8924, meta 0x1a276dc), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:31.542959+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:32.543168+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:33.543334+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 543588 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:34.543517+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:35.543728+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 16982016 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.887508392s of 16.273990631s, submitted: 31
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:36.543938+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:37.544198+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:38.544361+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:39.544597+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:40.544803+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:41.545010+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:42.545179+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:43.545390+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:44.545541+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:45.545695+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:46.545898+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:47.546137+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:48.546389+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:49.546632+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:50.546752+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:51.546974+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:52.547145+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:53.547467+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:54.547655+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:55.547832+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:56.548022+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:57.548268+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:58.548482+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 546216 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:59.548754+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 heartbeat osd_stat(store_statfs(0x4fd90d000/0x0/0x4ffc00000, data 0x84e6b4/0x8bd000, compress 0x0/0x0/0x0, omap 0x8bf3, meta 0x1a2740d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:00.548902+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:01.549136+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 16965632 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.484739304s of 26.491054535s, submitted: 13
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:02.549344+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66101248 unmapped: 16736256 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 68 ms_handle_reset con 0x559f0b2db800 session 0x559f0b1041c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:03.549563+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 16687104 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 552517 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:04.549765+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 16687104 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 68 heartbeat osd_stat(store_statfs(0x4fd908000/0x0/0x4ffc00000, data 0x8500a4/0x8c2000, compress 0x0/0x0/0x0, omap 0x8f21, meta 0x1a270df), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:05.549921+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 16588800 heap: 82837504 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:06.550079+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 24756224 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:07.550238+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 24715264 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 69 ms_handle_reset con 0x559f0b2db400 session 0x559f0b09f6c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 69 heartbeat osd_stat(store_statfs(0x4fc10a000/0x0/0x4ffc00000, data 0x20500a4/0x20c2000, compress 0x0/0x0/0x0, omap 0x9267, meta 0x1a26d99), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:08.550372+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 24690688 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 70 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09d6a540
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 563102 data_alloc: 218103808 data_used: 666
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08fd3c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:09.550511+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 23683072 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 70 heartbeat osd_stat(store_statfs(0x4fc105000/0x0/0x4ffc00000, data 0x2051671/0x20c5000, compress 0x0/0x0/0x0, omap 0x95e7, meta 0x1a26a19), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 70 handle_osd_map epochs [70,71], i have 70, src has [1,71]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f08fd3c00 session 0x559f09ce7180
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f0b2db400 session 0x559f0b0b1dc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fd8fe000/0x0/0x4ffc00000, data 0x852cc5/0x8ca000, compress 0x0/0x0/0x0, omap 0x9dc1, meta 0x1a2623f), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:10.550678+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67592192 unmapped: 23642112 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f0b2db800 session 0x559f0b0c9c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 71 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b0c8380
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:11.550860+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 23379968 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcb800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:12.550999+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 71 heartbeat osd_stat(store_statfs(0x4fd8fe000/0x0/0x4ffc00000, data 0x853ec3/0x8cb000, compress 0x0/0x0/0x0, omap 0xa06d, meta 0x1a25f93), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 23240704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.315886497s of 10.602708817s, submitted: 111
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 72 ms_handle_reset con 0x559f09dcb800 session 0x559f0b0c8fc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08fd3c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 72 ms_handle_reset con 0x559f08fd3c00 session 0x559f0af396c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 72 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b079a40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:13.551225+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 23543808 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 572170 data_alloc: 218103808 data_used: 4743
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:14.551361+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 23289856 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 72 handle_osd_map epochs [72,73], i have 72, src has [1,73]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 73 ms_handle_reset con 0x559f08567c00 session 0x559f0b078c40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 73 ms_handle_reset con 0x559f08566c00 session 0x559f09cc2700
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:15.551535+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af3bc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 22831104 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 73 handle_osd_map epochs [73,74], i have 73, src has [1,74]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 74 ms_handle_reset con 0x559f0af3bc00 session 0x559f09cc3a40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 74 ms_handle_reset con 0x559f08566400 session 0x559f0b05cc40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:16.551645+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 22675456 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:17.551763+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 22675456 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fd8f1000/0x0/0x4ffc00000, data 0x8595b7/0x8d5000, compress 0x0/0x0/0x0, omap 0xb3a3, meta 0x1a24c5d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:18.551975+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68558848 unmapped: 22675456 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 75 heartbeat osd_stat(store_statfs(0x4fd8f1000/0x0/0x4ffc00000, data 0x8595b7/0x8d5000, compress 0x0/0x0/0x0, omap 0xb3a3, meta 0x1a24c5d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 580126 data_alloc: 218103808 data_used: 4727
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:19.552165+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68591616 unmapped: 22642688 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af3bc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:20.552375+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 22831104 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 75 ms_handle_reset con 0x559f0af3bc00 session 0x559f0977ea80
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:21.552549+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68403200 unmapped: 22831104 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:22.552811+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68460544 unmapped: 22773760 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.737900734s of 10.061096191s, submitted: 136
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 77 ms_handle_reset con 0x559f08566c00 session 0x559f0977f880
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:23.552981+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 22659072 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 77 handle_osd_map epochs [77,78], i have 77, src has [1,78]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 78 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b0c9340
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 78 heartbeat osd_stat(store_statfs(0x4fd8eb000/0x0/0x4ffc00000, data 0x85d114/0x8df000, compress 0x0/0x0/0x0, omap 0xc01d, meta 0x1a23fe3), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 602228 data_alloc: 218103808 data_used: 12849
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:24.553181+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 78 heartbeat osd_stat(store_statfs(0x4fd8eb000/0x0/0x4ffc00000, data 0x85d114/0x8df000, compress 0x0/0x0/0x0, omap 0xc01d, meta 0x1a23fe3), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68624384 unmapped: 22609920 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 79 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09ce7180
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:25.553332+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 68476928 unmapped: 22757376 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 80 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b0c9880
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:26.553469+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69566464 unmapped: 21667840 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 81 ms_handle_reset con 0x559f08567c00 session 0x559f0977ee00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:27.553631+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:28.553808+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 620604 data_alloc: 218103808 data_used: 12849
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:29.553965+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:30.554184+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 82 heartbeat osd_stat(store_statfs(0x4fd8d1000/0x0/0x4ffc00000, data 0x86555b/0x8f3000, compress 0x0/0x0/0x0, omap 0xd14b, meta 0x1a22eb5), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 21610496 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 82 handle_osd_map epochs [82,83], i have 82, src has [1,83]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 83 ms_handle_reset con 0x559f08566c00 session 0x559f09cc2c40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:31.554353+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 20512768 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 84 ms_handle_reset con 0x559f08566400 session 0x559f09d6bdc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:32.554615+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af3bc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 20488192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 84 handle_osd_map epochs [84,85], i have 84, src has [1,85]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.950626373s of 10.085215569s, submitted: 81
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 85 ms_handle_reset con 0x559f0af3bc00 session 0x559f0af39500
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:33.554789+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 19333120 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 85 handle_osd_map epochs [85,86], i have 85, src has [1,86]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 86 ms_handle_reset con 0x559f08566400 session 0x559f0af39c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:34.554974+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 625457 data_alloc: 218103808 data_used: 12849
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 19038208 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 87 ms_handle_reset con 0x559f08566c00 session 0x559f0af388c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 87 heartbeat osd_stat(store_statfs(0x4fd8d1000/0x0/0x4ffc00000, data 0x8695c0/0x8f9000, compress 0x0/0x0/0x0, omap 0xe12c, meta 0x1a21ed4), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:35.555107+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72351744 unmapped: 18882560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 87 handle_osd_map epochs [87,88], i have 87, src has [1,88]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 88 ms_handle_reset con 0x559f08567c00 session 0x559f09cc3180
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:36.555326+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72376320 unmapped: 18857984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:37.555517+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fd8ca000/0x0/0x4ffc00000, data 0x86b287/0x8fc000, compress 0x0/0x0/0x0, omap 0xeb23, meta 0x1a214dd), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:38.555749+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:39.555967+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 629504 data_alloc: 218103808 data_used: 12849
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:40.556177+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 18808832 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 88 heartbeat osd_stat(store_statfs(0x4fd8ca000/0x0/0x4ffc00000, data 0x86b287/0x8fc000, compress 0x0/0x0/0x0, omap 0xeb23, meta 0x1a214dd), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:41.556354+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 18784256 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:42.556516+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 18784256 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:43.556830+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 18767872 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:44.556998+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 631348 data_alloc: 218103808 data_used: 12849
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72466432 unmapped: 18767872 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.084028244s of 12.300132751s, submitted: 126
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:45.557127+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72531968 unmapped: 18702336 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 90 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b05c380
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:46.557287+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fc726000/0x0/0x4ffc00000, data 0x86e168/0x904000, compress 0x0/0x0/0x0, omap 0xf537, meta 0x2bc0ac9), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:47.557457+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:48.557685+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:49.557800+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 638855 data_alloc: 218103808 data_used: 12849
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:50.557941+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72540160 unmapped: 18694144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 90 ms_handle_reset con 0x559f0b2db800 session 0x559f09cc3c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 90 heartbeat osd_stat(store_statfs(0x4fc726000/0x0/0x4ffc00000, data 0x86e168/0x904000, compress 0x0/0x0/0x0, omap 0xf537, meta 0x2bc0ac9), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:51.558080+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72630272 unmapped: 18604032 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f08566c00 session 0x559f0af38000
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f08566400 session 0x559f090328c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f08567c00 session 0x559f0977e700
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:52.558205+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:53.558365+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:54.558547+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 646548 data_alloc: 218103808 data_used: 12865
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:55.558698+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 92 heartbeat osd_stat(store_statfs(0x4fc71e000/0x0/0x4ffc00000, data 0x870d56/0x90a000, compress 0x0/0x0/0x0, omap 0xfe38, meta 0x2bc01c8), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:56.558885+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:57.559076+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:58.559297+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:59.559485+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 646548 data_alloc: 218103808 data_used: 12865
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:00.559653+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72728576 unmapped: 18505728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 92 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b09f880
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.700133324s of 15.770095825s, submitted: 49
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:01.559846+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 93 heartbeat osd_stat(store_statfs(0x4fc71e000/0x0/0x4ffc00000, data 0x870d56/0x90a000, compress 0x0/0x0/0x0, omap 0xfe38, meta 0x2bc01c8), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 93 ms_handle_reset con 0x559f0b2db400 session 0x559f0b05d180
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 18374656 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:02.560014+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72859648 unmapped: 18374656 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 93 ms_handle_reset con 0x559f08566400 session 0x559f0b05ddc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:03.560230+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 72892416 unmapped: 18341888 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 95 ms_handle_reset con 0x559f08566c00 session 0x559f0b05da40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:04.560369+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 660485 data_alloc: 218103808 data_used: 12865
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 95 heartbeat osd_stat(store_statfs(0x4fc717000/0x0/0x4ffc00000, data 0x8737db/0x911000, compress 0x0/0x0/0x0, omap 0x10399, meta 0x2bbfc67), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 17973248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:05.560523+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 17973248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:06.560701+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 17973248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:07.560857+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 95 heartbeat osd_stat(store_statfs(0x4fc6f1000/0x0/0x4ffc00000, data 0x898e2a/0x939000, compress 0x0/0x0/0x0, omap 0x10594, meta 0x2bbfa6c), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 17874944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 95 handle_osd_map epochs [95,96], i have 95, src has [1,96]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 96 ms_handle_reset con 0x559f0af1d800 session 0x559f09032fc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:08.561063+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 17891328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 97 ms_handle_reset con 0x559f0af1d400 session 0x559f0af38700
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:09.562653+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 666817 data_alloc: 218103808 data_used: 19521
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 97 heartbeat osd_stat(store_statfs(0x4fc6ee000/0x0/0x4ffc00000, data 0x89a413/0x93c000, compress 0x0/0x0/0x0, omap 0x10a4b, meta 0x2bbf5b5), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74391552 unmapped: 16842752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0ae9fc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 97 ms_handle_reset con 0x559f0ae9fc00 session 0x559f0b05c540
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:10.562880+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 97 ms_handle_reset con 0x559f08566400 session 0x559f0b078700
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74555392 unmapped: 16678912 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.905013084s of 10.014015198s, submitted: 64
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 98 ms_handle_reset con 0x559f08566c00 session 0x559f099688c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:11.564265+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74588160 unmapped: 16646144 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 99 ms_handle_reset con 0x559f0af1d400 session 0x559f0aa6cc40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:12.564711+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74604544 unmapped: 16629760 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0af1d800 session 0x559f0977fc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:13.564981+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:14.565158+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fc6e2000/0x0/0x4ffc00000, data 0x89fcaa/0x948000, compress 0x0/0x0/0x0, omap 0x11e47, meta 0x2bbe1b9), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 677227 data_alloc: 218103808 data_used: 19505
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:15.565400+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:16.565713+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74629120 unmapped: 16605184 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0ae9f000
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0ae9f000 session 0x559f0b05d500
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566c00 session 0x559f0b05c8c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566400 session 0x559f0b0c9180
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0af1d400 session 0x559f0b0d8a80
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f0af1d800 session 0x559f0af39500
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:17.565889+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08568400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08568400 session 0x559f09d6bc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566400 session 0x559f0b079dc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74825728 unmapped: 16408576 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:18.566019+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fc6e2000/0x0/0x4ffc00000, data 0x89fcaa/0x948000, compress 0x0/0x0/0x0, omap 0x11f49, meta 0x2bbe0b7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74825728 unmapped: 16408576 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:19.566159+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 ms_handle_reset con 0x559f08566c00 session 0x559f0b0eb6c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 678372 data_alloc: 218103808 data_used: 20137
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 heartbeat osd_stat(store_statfs(0x4fc6e2000/0x0/0x4ffc00000, data 0x89fcaa/0x948000, compress 0x0/0x0/0x0, omap 0x11f49, meta 0x2bbe0b7), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74842112 unmapped: 16392192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0af1d800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:20.566370+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74842112 unmapped: 16392192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.915824890s of 10.005084038s, submitted: 54
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:21.566678+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74842112 unmapped: 16392192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:22.566862+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 74776576 unmapped: 16457728 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 101 ms_handle_reset con 0x559f0b2dbc00 session 0x559f08c38540
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:23.567123+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f0b2db400 session 0x559f090328c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea000
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f0b2ea400 session 0x559f0977ea80
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f0b2ea000 session 0x559f0aa6ddc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f08566400 session 0x559f0b05da40
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08566c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 16064512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:24.567257+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 102 ms_handle_reset con 0x559f08566c00 session 0x559f0b0c96c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2db400
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 689739 data_alloc: 218103808 data_used: 24268
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 103 heartbeat osd_stat(store_statfs(0x4fc6db000/0x0/0x4ffc00000, data 0x8a2782/0x94f000, compress 0x0/0x0/0x0, omap 0x12951, meta 0x2bbd6af), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75210752 unmapped: 16023552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:25.567424+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2db400 session 0x559f0b083180
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75218944 unmapped: 16015360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09cc21c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:26.567607+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2dbc00 session 0x559f0aa6d6c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2ea800 session 0x559f0b0eaa80
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 75235328 unmapped: 15998976 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2eac00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 104 ms_handle_reset con 0x559f0b2eac00 session 0x559f0af38000
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:27.567766+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2eb000
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 14901248 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:28.567986+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 105 ms_handle_reset con 0x559f0b2eb000 session 0x559f0b082fc0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 14868480 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:29.568177+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 694003 data_alloc: 218103808 data_used: 24268
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 105 ms_handle_reset con 0x559f0af1d400 session 0x559f0b0c9340
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 105 ms_handle_reset con 0x559f0af1d800 session 0x559f0977f340
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 14868480 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 105 heartbeat osd_stat(store_statfs(0x4fc6d2000/0x0/0x4ffc00000, data 0x8a69c9/0x958000, compress 0x0/0x0/0x0, omap 0x1384e, meta 0x2bbc7b2), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2dbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:30.568316+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _renew_subs
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 106 ms_handle_reset con 0x559f0b2dbc00 session 0x559f09cc2000
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76374016 unmapped: 14860288 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:31.568719+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76414976 unmapped: 14819328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:32.568873+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.597695351s of 11.822847366s, submitted: 164
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76414976 unmapped: 14819328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:33.569111+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 107 heartbeat osd_stat(store_statfs(0x4fc6ce000/0x0/0x4ffc00000, data 0x8a948c/0x95c000, compress 0x0/0x0/0x0, omap 0x13f68, meta 0x2bbc098), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f08567c00 session 0x559f09032e00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f09dcbc00 session 0x559f0b078e00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76414976 unmapped: 14819328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f0b2ea800
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f0b2ea800 session 0x559f09033180
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:34.569228+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 695237 data_alloc: 218103808 data_used: 19252
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76324864 unmapped: 14909440 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f08567c00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:35.569404+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 107 heartbeat osd_stat(store_statfs(0x4fc6f4000/0x0/0x4ffc00000, data 0x885469/0x937000, compress 0x0/0x0/0x0, omap 0x13f68, meta 0x2bbc098), peers [0,1] op hist [0,0,0,1])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 107 ms_handle_reset con 0x559f08567c00 session 0x559f090328c0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: handle_auth_request added challenge on 0x559f09dcbc00
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76324864 unmapped: 14909440 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 108 ms_handle_reset con 0x559f09dcbc00 session 0x559f08c38700
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:36.569549+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:37.569769+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 108 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:38.569937+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 108 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:39.570237+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 698135 data_alloc: 218103808 data_used: 19252
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 108 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:40.570449+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:41.570743+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:42.571069+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:43.571436+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.643769264s of 10.903412819s, submitted: 68
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 109 heartbeat osd_stat(store_statfs(0x4fc6ef000/0x0/0x4ffc00000, data 0x886ad2/0x93a000, compress 0x0/0x0/0x0, omap 0x145c3, meta 0x2bbba3d), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:44.571742+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 700845 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:45.571923+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 109 heartbeat osd_stat(store_statfs(0x4fc6ed000/0x0/0x4ffc00000, data 0x887f9e/0x93d000, compress 0x0/0x0/0x0, omap 0x14858, meta 0x2bbb7a8), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:46.572079+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:47.572465+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:48.572764+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:49.573145+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:50.573271+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:51.573488+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:52.573621+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:53.573771+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:54.573884+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:55.574020+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:56.574170+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:57.574381+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:58.574521+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:59.574701+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:00.574909+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:01.575079+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:02.575248+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:03.575409+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:04.575550+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:05.575789+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:06.576050+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:07.576264+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:08.576432+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:09.576634+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:10.576813+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:11.576964+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:12.577144+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:13.577348+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:14.577768+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:15.577943+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:16.578102+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:17.578449+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:18.578621+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:19.578860+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:20.579031+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:21.579186+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:22.579406+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:23.579645+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:24.579797+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:25.579938+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:26.580096+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:27.580338+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:28.580672+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:29.580905+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:30.581123+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:31.581394+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:32.581642+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:33.581846+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:34.582012+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:35.582180+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:36.582370+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:37.582666+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:38.582883+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:39.583099+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:40.583376+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:41.583595+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:42.583808+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:43.584002+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:44.584195+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:45.584384+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:46.584553+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:47.584712+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:48.584875+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:49.585040+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread fragmentation_score=0.000147 took=0.000030s
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:50.585207+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:51.585425+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:52.585618+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:53.585779+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:54.585918+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:55.586079+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:56.586334+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:57.586547+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:58.586740+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:59.586894+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:00.587023+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:01.587135+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:02.587273+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 14893056 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:03.587411+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76480512 unmapped: 14753792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:04.587530+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config show' '{prefix=config show}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77086720 unmapped: 14147584 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:05.587678+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76890112 unmapped: 14344192 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:06.587828+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:07.588003+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 76947456 unmapped: 14286848 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'perf dump' '{prefix=perf dump}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:08.588181+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'perf schema' '{prefix=perf schema}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:09.588337+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:10.588494+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:11.588611+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:12.588743+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:13.588902+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:14.589025+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:15.589190+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:16.589335+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:17.589610+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:18.589716+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:19.589845+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:20.590046+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:21.590171+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:22.590309+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:23.590472+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:24.590607+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:25.590760+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:26.590902+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:27.591030+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:28.591196+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:29.591315+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:30.591435+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:31.591621+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:32.591736+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:33.591890+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:34.592042+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:35.592186+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:36.592354+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:37.592490+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:38.592649+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:39.592819+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:40.592997+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:41.593152+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:42.593329+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:43.593541+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:44.593702+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:45.593839+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:46.593987+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:47.594124+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:48.594346+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:49.594493+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:50.594667+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:51.594858+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:52.594980+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:53.595110+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:54.595274+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:55.595407+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:56.595607+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:57.595789+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:58.595966+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:59.596260+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:00.596447+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:01.596651+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:02.596832+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:03.597058+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:04.597207+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:05.597353+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77307904 unmapped: 13926400 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:06.597502+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:07.597659+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:08.597925+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:09.598055+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:10.598199+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:11.598366+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:12.598524+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:13.598801+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:14.598987+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:15.599159+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:16.599304+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:17.599463+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:18.599662+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:19.599791+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:20.599910+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:21.600141+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:22.602991+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:23.605207+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:24.605981+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:25.606351+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:26.607489+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:27.608694+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77176832 unmapped: 14057472 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:28.609211+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:29.609621+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:30.609971+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:31.610213+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:32.610991+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:33.612063+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:34.612396+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:35.612662+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:36.612881+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:37.613125+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:38.613306+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:39.613503+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:40.613694+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:41.613951+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:42.614160+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:43.614398+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:44.614604+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:45.614779+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:46.614921+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:47.615096+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:48.615277+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77185024 unmapped: 14049280 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:49.615636+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:50.615878+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:51.616092+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:52.616263+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:53.616621+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:54.617049+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:55.617479+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1857107739' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:56.618347+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:57.619082+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:58.619417+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:59.619788+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:00.620087+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:01.620750+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:02.621030+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:03.621361+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:04.621673+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:05.621936+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77193216 unmapped: 14041088 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:06.622171+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:07.622643+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:08.623070+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:09.623358+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:10.623558+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:11.623779+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:12.623941+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:13.624207+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:14.624415+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:15.624635+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:16.624890+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:17.625120+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:18.625488+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:19.625719+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:20.625945+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77201408 unmapped: 14032896 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:21.626273+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:22.626546+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:23.626927+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:24.627127+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:25.627378+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:26.627666+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:27.627845+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:28.628133+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:29.628331+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:30.628495+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:31.628672+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:32.628856+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:33.629109+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:34.629231+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77209600 unmapped: 14024704 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:35.629388+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:36.629735+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:37.630075+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:38.630347+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:39.630542+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:40.630849+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:41.631034+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:42.631270+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:43.631709+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:44.631994+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77217792 unmapped: 14016512 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:45.632303+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:46.632622+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:47.632844+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:48.633072+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:49.633338+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:50.633559+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:51.633830+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:52.634109+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:53.634428+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:54.634744+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:55.635071+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:56.635298+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:57.635428+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:58.635681+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:59.635914+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77225984 unmapped: 14008320 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:00.636165+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:01.636404+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:02.636728+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:03.637004+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:04.637241+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:05.637522+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:06.637826+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:07.638047+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:08.638290+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:09.638474+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:10.638696+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:11.639002+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:12.639252+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:13.639616+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:14.639817+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:15.639901+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:16.640042+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:17.640184+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:18.640317+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:19.640481+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77234176 unmapped: 14000128 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:20.640614+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:21.640740+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:22.640901+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:23.641105+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:24.641316+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:25.641499+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:26.641665+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:27.641809+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:28.641973+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:29.642125+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:30.642323+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:31.643445+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77242368 unmapped: 13991936 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:32.644561+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:33.645637+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:34.646338+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:35.647052+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:36.647453+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:37.648095+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:38.648503+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:39.649052+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:40.649475+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:41.650005+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:42.650695+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:43.651184+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:44.651437+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:45.651787+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:46.652111+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:47.652452+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:48.652727+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:49.652938+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:50.653075+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:51.653240+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:52.653464+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:53.653643+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:54.653787+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77250560 unmapped: 13983744 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:55.654039+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:56.654295+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:57.654502+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:58.654710+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:59.654839+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:00.655021+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:01.655220+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:02.655420+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:03.655665+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:04.655886+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:05.656092+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:06.656333+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:07.656656+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:08.656869+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:09.657095+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:10.657266+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:11.657497+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:12.657729+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77258752 unmapped: 13975552 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:13.657973+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:14.658195+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:15.658444+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:16.658649+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:17.658902+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:18.659112+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:19.659270+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:20.659446+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77266944 unmapped: 13967360 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:21.659637+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77135872 unmapped: 14098432 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:22.659838+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77135872 unmapped: 14098432 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:23.660035+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77135872 unmapped: 14098432 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:24.660241+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77135872 unmapped: 14098432 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:25.660404+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77135872 unmapped: 14098432 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:26.660605+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:27.660846+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:28.660990+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:29.661120+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:30.661313+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:31.661538+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:32.661747+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:33.662004+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:34.662173+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:35.662367+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:36.662619+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:37.662826+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:38.662980+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77144064 unmapped: 14090240 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:39.663190+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:40.663390+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:41.663651+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:42.663864+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:43.664122+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:44.664289+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:45.664492+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:46.664682+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:47.664883+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:48.665043+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 5828 writes, 23K keys, 5828 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5828 writes, 1121 syncs, 5.20 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1678 writes, 4312 keys, 1678 commit groups, 1.0 writes per commit group, ingest: 2.35 MB, 0.00 MB/s
                                           Interval WAL: 1678 writes, 755 syncs, 2.22 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:49.665216+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:50.665398+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77152256 unmapped: 14082048 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:51.665562+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77160448 unmapped: 14073856 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:52.665787+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77160448 unmapped: 14073856 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:53.666016+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77160448 unmapped: 14073856 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:54.666217+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77160448 unmapped: 14073856 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:55.666682+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: mgrc ms_handle_reset ms_handle_reset con 0x559f08df0000
Dec 03 21:43:43 compute-0 ceph-osd[88129]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1553116858
Dec 03 21:43:43 compute-0 ceph-osd[88129]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1553116858,v1:192.168.122.100:6801/1553116858]
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: get_auth_request con 0x559f0b2ea800 auth_method 0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: mgrc handle_mgr_configure stats_period=5
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:56.666863+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:57.667077+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:58.667261+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:59.667419+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:00.667647+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:01.667808+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:02.668011+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:03.668238+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77430784 unmapped: 13803520 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:04.668417+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77438976 unmapped: 13795328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:05.668628+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77438976 unmapped: 13795328 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:06.668825+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:07.668991+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:08.669150+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:09.669363+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:10.669487+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:11.669644+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:12.670373+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:13.670706+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:14.670887+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:15.671040+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:16.672009+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:17.672199+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:18.672382+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:19.672695+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:20.672859+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:21.673012+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77447168 unmapped: 13787136 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:22.673127+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:23.673283+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:24.673484+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:25.673644+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:26.673781+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:27.673950+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:28.674172+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:29.674362+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:30.674549+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:31.674749+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:32.674903+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:33.675069+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:34.675188+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:35.675326+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:36.675527+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:37.675670+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:38.675797+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:39.675945+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:40.676405+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:41.678916+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77455360 unmapped: 13778944 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:42.679539+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:43.680082+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:44.680666+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:45.680964+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:46.681156+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:47.681720+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:48.682086+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:49.682762+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:50.682961+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:51.683172+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:52.683467+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:53.683715+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:54.683881+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:55.684015+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:56.684160+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:57.684307+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:58.684449+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:59.684695+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:00.684913+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:01.685120+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:02.685327+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:03.685704+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:04.685899+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:05.686153+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:06.686315+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:07.686450+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:08.686654+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:09.686848+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:10.687047+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:11.687241+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:12.687460+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:13.687706+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:14.687858+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:15.687980+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:16.688146+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:17.688339+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:18.688506+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:19.688636+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:20.688815+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:21.689000+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:22.689194+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:23.689443+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:24.689669+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:25.689867+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:26.690067+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:27.690286+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:28.690461+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:29.690702+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:30.690867+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:31.690998+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:32.691108+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:33.691344+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:34.691492+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:35.691676+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:36.691832+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:37.692017+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:38.692179+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:39.692387+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:40.692550+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:41.692716+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:42.692893+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:43.693106+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:44.693267+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:45.693400+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:46.693538+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:47.693698+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:48.693848+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:49.694068+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:50.694286+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:51.694478+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:52.694643+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:53.694843+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:54.695086+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:55.695318+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:56.695550+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:57.695808+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:58.696030+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:59.696293+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:00.696532+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:01.696857+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:02.697131+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:03.697454+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77463552 unmapped: 13770752 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:04.697740+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:05.698022+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:06.698265+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:07.698713+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:08.699043+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:09.699260+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:10.699455+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:11.699725+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:12.700036+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:13.700323+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:14.700551+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:15.700883+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:16.701422+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15070 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:17.705691+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:18.707062+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:19.707989+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77471744 unmapped: 13762560 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:20.708686+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:21.709077+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:22.709298+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:23.709835+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:24.710370+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:25.710699+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:26.711098+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:27.711459+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:28.711779+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:29.712192+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:30.712516+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77479936 unmapped: 13754368 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:31.712826+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:32.713108+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:33.713393+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:34.713633+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:35.713813+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:36.714063+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:37.714423+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:38.714670+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:39.714940+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:40.715214+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:41.715550+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:42.715854+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:43.716160+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:44.716398+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:45.716665+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 13746176 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:46.716780+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:47.716911+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:48.717078+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:49.717908+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:50.718633+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:51.718904+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:52.719082+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:53.719320+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:54.719466+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:55.719638+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:56.719813+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:57.720204+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:58.720480+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:59.720691+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:00.723005+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:01.723215+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:02.723521+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:03.723806+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:04.724001+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:05.724267+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:06.724487+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:07.724752+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:08.725004+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:09.725279+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:10.725484+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:11.725682+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:12.725847+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:13.726086+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:14.726303+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:15.726640+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:16.726796+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:17.727056+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:18.727251+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:19.727468+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:20.727650+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:21.727946+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:22.728246+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:23.728543+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:24.728684+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:25.728955+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:26.729198+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:27.729409+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:28.729593+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:29.729771+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:30.730003+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:31.730227+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77496320 unmapped: 13737984 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:32.730477+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:33.731181+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:34.731443+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:35.731634+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:36.731806+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:37.732037+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:38.732188+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:39.732386+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:40.732550+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:41.732805+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:42.733045+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:43.733349+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 13729792 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:44.733507+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:45.733666+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:46.733908+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:47.734131+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:48.734280+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:49.734476+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:50.734679+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:51.734903+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:52.735130+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:53.735328+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:54.735490+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:55.735637+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:56.735786+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:57.735980+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:58.736140+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:59.736319+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:00.736458+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:01.737308+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:02.738059+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:03.738222+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:04.738389+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:05.738525+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:06.738657+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:07.738777+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:08.738910+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:09.739055+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77520896 unmapped: 13713408 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:10.739203+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config show' '{prefix=config show}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 13721600 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:11.739333+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77324288 unmapped: 13910016 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:43 compute-0 ceph-osd[88129]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:43 compute-0 ceph-osd[88129]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 703619 data_alloc: 218103808 data_used: 23313
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: tick
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_tickets
Dec 03 21:43:43 compute-0 ceph-osd[88129]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:12.739456+0000)
Dec 03 21:43:43 compute-0 ceph-osd[88129]: osd.2 110 heartbeat osd_stat(store_statfs(0x4fc6ea000/0x0/0x4ffc00000, data 0x88944e/0x940000, compress 0x0/0x0/0x0, omap 0x14bb8, meta 0x2bbb448), peers [0,1] op hist [])
Dec 03 21:43:43 compute-0 ceph-osd[88129]: prioritycache tune_memory target: 4294967296 mapped: 77332480 unmapped: 13901824 heap: 91234304 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:43 compute-0 ceph-osd[88129]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:43:43 compute-0 ceph-mon[75204]: pgmap v1161: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:43 compute-0 ceph-mon[75204]: from='client.15062 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3284278167' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Dec 03 21:43:43 compute-0 ceph-mon[75204]: from='client.15066 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:43 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1857107739' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Dec 03 21:43:44 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:43:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 03 21:43:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/647628077' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:43:44 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15074 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:44 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1162: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:44 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15078 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:44 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 03 21:43:44 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1918744001' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:43:44 compute-0 ceph-mon[75204]: from='client.15070 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:44 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/647628077' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Dec 03 21:43:44 compute-0 ceph-mon[75204]: from='client.15074 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:44 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1918744001' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Dec 03 21:43:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:45 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15082 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 03 21:43:45 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1382647860' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:43:45 compute-0 crontab[264501]: (root) LIST (root)
Dec 03 21:43:45 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15086 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:45 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 03 21:43:45 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1263793158' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:43:45 compute-0 ceph-mon[75204]: pgmap v1162: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:45 compute-0 ceph-mon[75204]: from='client.15078 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:45 compute-0 ceph-mon[75204]: from='client.15082 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:45 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1382647860' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Dec 03 21:43:45 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1263793158' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Dec 03 21:43:46 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15088 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:46 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 03 21:43:46 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/828817084' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:43:46 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1163: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:46 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15092 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:47 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 03 21:43:47 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2530743118' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 03 21:43:47 compute-0 ceph-mon[75204]: from='client.15086 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:47 compute-0 ceph-mon[75204]: from='client.15088 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:47 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/828817084' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Dec 03 21:43:47 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15096 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:47 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15100 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:48 compute-0 ceph-mon[75204]: pgmap v1163: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:48 compute-0 ceph-mon[75204]: from='client.15092 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:48 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2530743118' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Dec 03 21:43:48 compute-0 ceph-mon[75204]: from='client.15096 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 03 21:43:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2129293945' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:36.566789+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66052096 unmapped: 1048576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:37.566907+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66052096 unmapped: 1048576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:38.567129+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66060288 unmapped: 1040384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:39.567336+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 468059 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66068480 unmapped: 1032192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:40.567482+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66076672 unmapped: 1024000 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 14.061655045s of 14.075113297s, submitted: 6
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:41.567722+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:11.499858+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.1b scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:11.510427+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.1b scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 69)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:11.499858+0000 osd.1 (osd.1) 68 : cluster [DBG] 2.1b scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:11.510427+0000 osd.1 (osd.1) 69 : cluster [DBG] 2.1b scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:42.567989+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:43.568117+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 1 last_log 70 sent 69 num 1 unsent 1 sending 1
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:13.562030+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.11 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 70)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:13.562030+0000 osd.1 (osd.1) 70 : cluster [DBG] 5.11 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 1007616 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:44.568301+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 1 last_log 71 sent 70 num 1 unsent 1 sending 1
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:13.572546+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.11 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 71)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:13.572546+0000 osd.1 (osd.1) 71 : cluster [DBG] 5.11 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 472885 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66101248 unmapped: 999424 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:45.568522+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:15.554886+0000 osd.1 (osd.1) 72 : cluster [DBG] 4.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:15.565426+0000 osd.1 (osd.1) 73 : cluster [DBG] 4.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 73)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:15.554886+0000 osd.1 (osd.1) 72 : cluster [DBG] 4.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:15.565426+0000 osd.1 (osd.1) 73 : cluster [DBG] 4.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66101248 unmapped: 999424 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:46.568730+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:16.552950+0000 osd.1 (osd.1) 74 : cluster [DBG] 4.10 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:16.563498+0000 osd.1 (osd.1) 75 : cluster [DBG] 4.10 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 75)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:16.552950+0000 osd.1 (osd.1) 74 : cluster [DBG] 4.10 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:16.563498+0000 osd.1 (osd.1) 75 : cluster [DBG] 4.10 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 991232 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:47.568897+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 991232 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:48.569047+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:49.569205+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 477709 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:50.569342+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:51.569495+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:52.569610+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.014001846s of 12.029939651s, submitted: 8
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:53.569751+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:23.529817+0000 osd.1 (osd.1) 76 : cluster [DBG] 2.17 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:23.540346+0000 osd.1 (osd.1) 77 : cluster [DBG] 2.17 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 77)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:23.529817+0000 osd.1 (osd.1) 76 : cluster [DBG] 2.17 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:23.540346+0000 osd.1 (osd.1) 77 : cluster [DBG] 2.17 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:54.569962+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 480122 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:55.570081+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:56.570241+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:26.538444+0000 osd.1 (osd.1) 78 : cluster [DBG] 4.f scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:26.548858+0000 osd.1 (osd.1) 79 : cluster [DBG] 4.f scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66174976 unmapped: 925696 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 79)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:26.538444+0000 osd.1 (osd.1) 78 : cluster [DBG] 4.f scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:26.548858+0000 osd.1 (osd.1) 79 : cluster [DBG] 4.f scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:57.570418+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66183168 unmapped: 917504 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:58.570650+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66183168 unmapped: 917504 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:59.570760+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:29.547396+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.15 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:29.557980+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.15 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 484946 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 81)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:29.547396+0000 osd.1 (osd.1) 80 : cluster [DBG] 2.15 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:29.557980+0000 osd.1 (osd.1) 81 : cluster [DBG] 2.15 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:00.570919+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:01.571089+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66199552 unmapped: 901120 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:02.571261+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:32.532141+0000 osd.1 (osd.1) 82 : cluster [DBG] 5.13 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:32.542720+0000 osd.1 (osd.1) 83 : cluster [DBG] 5.13 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 83)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:32.532141+0000 osd.1 (osd.1) 82 : cluster [DBG] 5.13 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:32.542720+0000 osd.1 (osd.1) 83 : cluster [DBG] 5.13 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:03.571557+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:04.571759+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 487359 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.013473511s of 12.029477119s, submitted: 8
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:05.571907+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:35.559264+0000 osd.1 (osd.1) 84 : cluster [DBG] 5.12 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:35.569835+0000 osd.1 (osd.1) 85 : cluster [DBG] 5.12 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 85)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:35.559264+0000 osd.1 (osd.1) 84 : cluster [DBG] 5.12 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:35.569835+0000 osd.1 (osd.1) 85 : cluster [DBG] 5.12 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:06.572110+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:07.572335+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66232320 unmapped: 868352 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:08.572535+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:09.572702+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:39.545417+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:39.555959+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 492183 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 851968 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 87)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:39.545417+0000 osd.1 (osd.1) 86 : cluster [DBG] 5.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:39.555959+0000 osd.1 (osd.1) 87 : cluster [DBG] 5.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:10.572910+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:40.535970+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:40.546521+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66256896 unmapped: 843776 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 89)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:40.535970+0000 osd.1 (osd.1) 88 : cluster [DBG] 5.16 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:40.546521+0000 osd.1 (osd.1) 89 : cluster [DBG] 5.16 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:11.573129+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66265088 unmapped: 835584 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:12.573291+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66265088 unmapped: 835584 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:13.573436+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:43.429237+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:43.439861+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 91)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:43.429237+0000 osd.1 (osd.1) 90 : cluster [DBG] 2.d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:43.439861+0000 osd.1 (osd.1) 91 : cluster [DBG] 2.d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:14.573686+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 497007 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:15.573866+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:16.573980+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.861497879s of 11.878818512s, submitted: 8
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:17.574161+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:47.438326+0000 osd.1 (osd.1) 92 : cluster [DBG] 4.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:47.448771+0000 osd.1 (osd.1) 93 : cluster [DBG] 4.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 93)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:47.438326+0000 osd.1 (osd.1) 92 : cluster [DBG] 4.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:47.448771+0000 osd.1 (osd.1) 93 : cluster [DBG] 4.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:18.574396+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:19.574606+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:49.419861+0000 osd.1 (osd.1) 94 : cluster [DBG] 2.3 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:49.430311+0000 osd.1 (osd.1) 95 : cluster [DBG] 2.3 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 501829 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 95)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:49.419861+0000 osd.1 (osd.1) 94 : cluster [DBG] 2.3 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:49.430311+0000 osd.1 (osd.1) 95 : cluster [DBG] 2.3 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:20.574830+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:21.574983+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:22.575201+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:23.575341+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:24.575508+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 501829 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66338816 unmapped: 761856 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:25.575693+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:55.425956+0000 osd.1 (osd.1) 96 : cluster [DBG] 4.5 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:55.436538+0000 osd.1 (osd.1) 97 : cluster [DBG] 4.5 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 97)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:55.425956+0000 osd.1 (osd.1) 96 : cluster [DBG] 4.5 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:55.436538+0000 osd.1 (osd.1) 97 : cluster [DBG] 4.5 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66347008 unmapped: 753664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:26.575965+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66347008 unmapped: 753664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:27.576174+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66347008 unmapped: 753664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:28.576452+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.a scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.959852219s of 11.973832130s, submitted: 6
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.a scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:29.576670+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:59.412221+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:12:59.422160+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 99)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:59.412221+0000 osd.1 (osd.1) 98 : cluster [DBG] 2.a scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:12:59.422160+0000 osd.1 (osd.1) 99 : cluster [DBG] 2.a scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 506651 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:30.576966+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:31.577112+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:32.577255+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:33.577385+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66404352 unmapped: 696320 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:34.577506+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 506651 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:35.577692+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:36.577892+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:37.578176+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:38.578500+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:39.578784+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 506651 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:40.578977+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.087844849s of 12.090794563s, submitted: 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:41.579234+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:11.503004+0000 osd.1 (osd.1) 100 : cluster [DBG] 4.7 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:11.513501+0000 osd.1 (osd.1) 101 : cluster [DBG] 4.7 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 101)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:11.503004+0000 osd.1 (osd.1) 100 : cluster [DBG] 4.7 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:11.513501+0000 osd.1 (osd.1) 101 : cluster [DBG] 4.7 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:42.579503+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:43.579698+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:44.579842+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 509062 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:45.580007+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66469888 unmapped: 630784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:46.580113+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:16.522398+0000 osd.1 (osd.1) 102 : cluster [DBG] 2.5 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:16.532978+0000 osd.1 (osd.1) 103 : cluster [DBG] 2.5 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 103)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:16.522398+0000 osd.1 (osd.1) 102 : cluster [DBG] 2.5 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:16.532978+0000 osd.1 (osd.1) 103 : cluster [DBG] 2.5 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:47.580344+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:48.580614+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:18.457633+0000 osd.1 (osd.1) 104 : cluster [DBG] 2.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:18.468178+0000 osd.1 (osd.1) 105 : cluster [DBG] 2.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 105)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:18.457633+0000 osd.1 (osd.1) 104 : cluster [DBG] 2.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:18.468178+0000 osd.1 (osd.1) 105 : cluster [DBG] 2.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66510848 unmapped: 589824 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:49.580815+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:19.466828+0000 osd.1 (osd.1) 106 : cluster [DBG] 2.7 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:19.477343+0000 osd.1 (osd.1) 107 : cluster [DBG] 2.7 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 107)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:19.466828+0000 osd.1 (osd.1) 106 : cluster [DBG] 2.7 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:19.477343+0000 osd.1 (osd.1) 107 : cluster [DBG] 2.7 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 518706 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:50.580998+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:20.494924+0000 osd.1 (osd.1) 108 : cluster [DBG] 5.1 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:20.505515+0000 osd.1 (osd.1) 109 : cluster [DBG] 5.1 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 109)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:20.494924+0000 osd.1 (osd.1) 108 : cluster [DBG] 5.1 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:20.505515+0000 osd.1 (osd.1) 109 : cluster [DBG] 5.1 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66527232 unmapped: 573440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:51.581155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.991565704s of 11.008629799s, submitted: 10
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:52.581279+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:22.511695+0000 osd.1 (osd.1) 110 : cluster [DBG] 2.6 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:22.522187+0000 osd.1 (osd.1) 111 : cluster [DBG] 2.6 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66535424 unmapped: 565248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 111)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:22.511695+0000 osd.1 (osd.1) 110 : cluster [DBG] 2.6 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:22.522187+0000 osd.1 (osd.1) 111 : cluster [DBG] 2.6 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.f scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.f scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:53.581460+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:23.523664+0000 osd.1 (osd.1) 112 : cluster [DBG] 5.f scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:23.534195+0000 osd.1 (osd.1) 113 : cluster [DBG] 5.f scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 113)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:23.523664+0000 osd.1 (osd.1) 112 : cluster [DBG] 5.f scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:23.534195+0000 osd.1 (osd.1) 113 : cluster [DBG] 5.f scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:54.581671+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 523528 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:55.581803+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:56.581965+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:57.582189+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:58.582354+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 115 sent 113 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:28.530907+0000 osd.1 (osd.1) 114 : cluster [DBG] 5.c scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:28.541418+0000 osd.1 (osd.1) 115 : cluster [DBG] 5.c scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:59.582530+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 4 last_log 117 sent 115 num 4 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:29.520775+0000 osd.1 (osd.1) 116 : cluster [DBG] 5.1d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:29.531381+0000 osd.1 (osd.1) 117 : cluster [DBG] 5.1d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 115)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:28.530907+0000 osd.1 (osd.1) 114 : cluster [DBG] 5.c scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:28.541418+0000 osd.1 (osd.1) 115 : cluster [DBG] 5.c scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 528352 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:00.582778+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 117)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:29.520775+0000 osd.1 (osd.1) 116 : cluster [DBG] 5.1d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:29.531381+0000 osd.1 (osd.1) 117 : cluster [DBG] 5.1d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:01.582931+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 119 sent 117 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:31.514449+0000 osd.1 (osd.1) 118 : cluster [DBG] 5.19 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:31.524985+0000 osd.1 (osd.1) 119 : cluster [DBG] 5.19 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 119)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:31.514449+0000 osd.1 (osd.1) 118 : cluster [DBG] 5.19 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:31.524985+0000 osd.1 (osd.1) 119 : cluster [DBG] 5.19 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:02.583175+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.000996590s of 11.020680428s, submitted: 10
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:03.583342+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 121 sent 119 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:33.532373+0000 osd.1 (osd.1) 120 : cluster [DBG] 2.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:33.542789+0000 osd.1 (osd.1) 121 : cluster [DBG] 2.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 121)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:33.532373+0000 osd.1 (osd.1) 120 : cluster [DBG] 2.9 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:33.542789+0000 osd.1 (osd.1) 121 : cluster [DBG] 2.9 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:04.583620+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 535589 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:05.583760+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 123 sent 121 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:35.487393+0000 osd.1 (osd.1) 122 : cluster [DBG] 5.1a scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:35.498089+0000 osd.1 (osd.1) 123 : cluster [DBG] 5.1a scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 123)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:35.487393+0000 osd.1 (osd.1) 122 : cluster [DBG] 5.1a scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:35.498089+0000 osd.1 (osd.1) 123 : cluster [DBG] 5.1a scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:06.583993+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:07.584133+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:08.584430+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 125 sent 123 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:38.562295+0000 osd.1 (osd.1) 124 : cluster [DBG] 5.18 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:38.572881+0000 osd.1 (osd.1) 125 : cluster [DBG] 5.18 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 125)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:38.562295+0000 osd.1 (osd.1) 124 : cluster [DBG] 5.18 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:38.572881+0000 osd.1 (osd.1) 125 : cluster [DBG] 5.18 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:09.584680+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:10.584943+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 127 sent 125 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:40.547425+0000 osd.1 (osd.1) 126 : cluster [DBG] 6.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:40.572136+0000 osd.1 (osd.1) 127 : cluster [DBG] 6.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 540413 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 127)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:40.547425+0000 osd.1 (osd.1) 126 : cluster [DBG] 6.4 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:40.572136+0000 osd.1 (osd.1) 127 : cluster [DBG] 6.4 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:11.585164+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:12.585283+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 129 sent 127 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:42.465943+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.b scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:42.480111+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.b scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 129)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:42.465943+0000 osd.1 (osd.1) 128 : cluster [DBG] 6.b scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:42.480111+0000 osd.1 (osd.1) 129 : cluster [DBG] 6.b scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:13.585488+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 131 sent 129 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:43.467445+0000 osd.1 (osd.1) 130 : cluster [DBG] 6.1 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:43.477985+0000 osd.1 (osd.1) 131 : cluster [DBG] 6.1 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.957039833s of 10.982520103s, submitted: 12
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 131)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:43.467445+0000 osd.1 (osd.1) 130 : cluster [DBG] 6.1 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:43.477985+0000 osd.1 (osd.1) 131 : cluster [DBG] 6.1 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:14.585787+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 133 sent 131 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:44.514975+0000 osd.1 (osd.1) 132 : cluster [DBG] 6.6 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:44.529090+0000 osd.1 (osd.1) 133 : cluster [DBG] 6.6 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 133)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:44.514975+0000 osd.1 (osd.1) 132 : cluster [DBG] 6.6 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:44.529090+0000 osd.1 (osd.1) 133 : cluster [DBG] 6.6 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:15.585978+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 135 sent 133 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:45.562112+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.2 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:45.572685+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.2 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 550057 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 135)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:45.562112+0000 osd.1 (osd.1) 134 : cluster [DBG] 6.2 scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:45.572685+0000 osd.1 (osd.1) 135 : cluster [DBG] 6.2 scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:16.586159+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 137 sent 135 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:46.554453+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:46.572068+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 137)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:46.554453+0000 osd.1 (osd.1) 136 : cluster [DBG] 6.d scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:46.572068+0000 osd.1 (osd.1) 137 : cluster [DBG] 6.d scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:17.586370+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 139 sent 137 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:47.542428+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.c scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:47.556620+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.c scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 139)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:47.542428+0000 osd.1 (osd.1) 138 : cluster [DBG] 6.c scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:47.556620+0000 osd.1 (osd.1) 139 : cluster [DBG] 6.c scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:18.586645+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:19.586800+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:20.586962+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 554879 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:21.587096+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  log_queue is 2 last_log 141 sent 139 num 2 unsent 2 sending 2
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:51.549371+0000 osd.1 (osd.1) 140 : cluster [DBG] 6.e scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  will send 2025-12-03T21:13:51.563550+0000 osd.1 (osd.1) 141 : cluster [DBG] 6.e scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client handle_log_ack log(last 141)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:51.549371+0000 osd.1 (osd.1) 140 : cluster [DBG] 6.e scrub starts
Dec 03 21:43:48 compute-0 ceph-osd[87094]: log_client  logged 2025-12-03T21:13:51.563550+0000 osd.1 (osd.1) 141 : cluster [DBG] 6.e scrub ok
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:22.587311+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:23.587510+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:24.587732+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:25.587931+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:26.588138+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:27.588293+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:28.588469+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:29.588701+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:30.588874+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:31.589026+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:32.589183+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:33.589369+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:34.589488+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:35.589636+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:36.589767+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:37.589964+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:38.590168+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:39.590319+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:40.590455+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:41.590650+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:42.590934+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:43.591114+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:44.591289+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:45.591505+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:46.591657+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:47.591783+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:48.591966+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:49.592096+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:50.592213+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:51.592353+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:52.592494+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:53.592693+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:54.592858+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:55.592986+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:56.593146+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:57.593314+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:58.593497+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:59.593645+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:00.593813+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:01.593958+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:02.594145+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:03.594302+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:04.594452+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:05.594584+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:06.594757+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:07.594898+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:08.595098+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:09.595284+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:10.595484+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:11.595676+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:12.595835+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:13.596053+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:14.596306+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:15.596519+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:16.596668+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:17.596848+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15104 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:48 compute-0 ceph-mgr[75500]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:43:48 compute-0 ceph-c21de27e-a7fd-594b-8324-0697ba9aab3a-mgr-compute-0-jxauqt[75496]: 2025-12-03T21:43:48.456+0000 7fa8c63a3640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:18.597029+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:19.597197+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:20.597369+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:21.597526+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:22.597634+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:23.597784+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:24.597955+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:25.598067+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:26.598203+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:27.598394+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:28.598640+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:29.598826+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:30.599058+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:31.599451+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:32.599631+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:33.599819+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:34.599973+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:35.600175+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:36.600345+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:37.600528+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:38.600738+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:39.600901+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:40.601080+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:41.601213+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:42.601376+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:43.601532+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:44.601691+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:45.601829+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:46.601997+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:47.602113+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:48.602363+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:49.602522+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:50.602744+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:51.602947+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:52.603140+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:53.603300+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:54.603481+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:55.603640+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:56.603809+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:57.603957+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:58.604162+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:59.604340+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:00.604466+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67051520 unmapped: 49152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:01.604709+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67051520 unmapped: 49152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:02.604879+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67051520 unmapped: 49152 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:03.605051+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67059712 unmapped: 40960 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:04.605207+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67059712 unmapped: 40960 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:05.605372+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 32768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:06.605503+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 32768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:07.605663+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67067904 unmapped: 32768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:08.605860+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67076096 unmapped: 24576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:09.606037+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67076096 unmapped: 24576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:10.606180+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67084288 unmapped: 16384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:11.606385+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67084288 unmapped: 16384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:12.606536+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 8192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:13.606671+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 8192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:14.606869+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67092480 unmapped: 8192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:15.606996+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:16.607138+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:17.607300+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:18.607492+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:19.607655+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:20.607813+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:21.607961+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:22.608095+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:23.608237+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:24.608388+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:25.608549+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:26.608687+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:27.608799+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:28.609003+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:29.609139+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:30.609303+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:31.609457+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:32.609642+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:33.609827+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:34.610007+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:35.610217+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:36.610363+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:37.610514+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:38.610736+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:39.610861+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:40.611069+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:41.611198+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:42.611319+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:43.611500+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:44.611619+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:45.611820+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:46.612010+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:47.612146+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:48.612438+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:49.612657+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:50.612846+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:51.613007+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67223552 unmapped: 925696 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:52.613136+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67223552 unmapped: 925696 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:53.613341+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:54.613468+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:55.613614+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:56.613732+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:57.613869+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:58.614054+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:59.614230+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:00.614398+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:01.614560+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:02.614745+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:03.614865+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:04.615001+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:05.615155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:06.615277+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:07.615438+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:08.615645+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:09.615799+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:10.615932+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:11.616062+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67289088 unmapped: 860160 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:12.616191+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67289088 unmapped: 860160 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:13.616416+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:14.616645+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:15.616814+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:16.616944+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:17.617185+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:18.617402+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:19.617528+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:20.617623+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:21.617762+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:22.617894+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:23.618038+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:24.618159+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:25.618308+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:26.618465+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:27.618661+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:28.618819+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:29.618941+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:30.619171+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:31.619364+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:32.619549+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:33.619710+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:34.619925+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:35.620052+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:36.620180+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:37.620327+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:38.620545+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:39.620749+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:40.620961+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:41.621202+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:42.621436+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:43.621632+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:44.621800+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:45.621989+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:46.622172+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:47.622353+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:48.622651+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:49.622878+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:50.622996+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:51.623144+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:52.623403+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:53.623590+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:54.623850+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:55.624049+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:56.624339+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:57.624525+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:58.624828+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:59.625080+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:00.625320+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:01.625493+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:02.625787+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:03.625972+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:04.626168+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:05.626397+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:06.626613+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:07.626865+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:08.627062+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:09.627208+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:10.627383+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:11.627583+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:12.627932+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:13.628255+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:14.628509+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:15.628732+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:16.628994+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:17.629213+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:18.629533+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:19.629899+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:20.630155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67510272 unmapped: 638976 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:21.630729+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67510272 unmapped: 638976 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:22.630897+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67510272 unmapped: 638976 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:23.631190+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67518464 unmapped: 630784 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:24.631417+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67518464 unmapped: 630784 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:25.631645+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67518464 unmapped: 630784 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:26.631847+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 622592 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:27.632062+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67526656 unmapped: 622592 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:28.632333+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67534848 unmapped: 614400 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:29.632489+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67534848 unmapped: 614400 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:30.632702+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:31.632835+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:32.632973+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:33.633155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67551232 unmapped: 598016 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:34.633290+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67559424 unmapped: 589824 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:35.633457+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67559424 unmapped: 589824 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:36.633648+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67567616 unmapped: 581632 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:37.633849+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67567616 unmapped: 581632 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:38.634085+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67575808 unmapped: 573440 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:39.634234+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67575808 unmapped: 573440 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:40.634386+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67584000 unmapped: 565248 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:41.634608+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67584000 unmapped: 565248 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:42.634776+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67584000 unmapped: 565248 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:43.634997+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67592192 unmapped: 557056 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:44.635201+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67592192 unmapped: 557056 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:45.635359+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67600384 unmapped: 548864 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:46.635643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67600384 unmapped: 548864 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:47.635844+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67600384 unmapped: 548864 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:48.636025+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:49.636158+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:50.636333+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:51.636710+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:52.636858+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67608576 unmapped: 540672 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:53.636987+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:54.637117+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:55.637487+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67616768 unmapped: 532480 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:56.637791+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67624960 unmapped: 524288 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:57.638071+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67624960 unmapped: 524288 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:58.638450+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67633152 unmapped: 516096 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:59.638685+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67633152 unmapped: 516096 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:00.638869+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 507904 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:01.638990+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 507904 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:02.639150+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67641344 unmapped: 507904 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:03.639328+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67649536 unmapped: 499712 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:04.639467+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67649536 unmapped: 499712 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:05.639681+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67649536 unmapped: 499712 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:06.639891+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67657728 unmapped: 491520 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:07.640152+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67657728 unmapped: 491520 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:08.640450+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67665920 unmapped: 483328 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:09.640664+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67665920 unmapped: 483328 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:10.640942+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 475136 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:11.641179+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 475136 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:12.641372+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67674112 unmapped: 475136 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:13.641520+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67682304 unmapped: 466944 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:14.641812+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67682304 unmapped: 466944 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:15.642059+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 458752 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:16.643149+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 458752 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:17.643295+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67690496 unmapped: 458752 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:18.643478+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67698688 unmapped: 450560 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:19.643642+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67698688 unmapped: 450560 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:20.643801+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67698688 unmapped: 450560 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:21.644014+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67706880 unmapped: 442368 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:22.644222+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67706880 unmapped: 442368 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:23.644454+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67715072 unmapped: 434176 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:24.644708+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67715072 unmapped: 434176 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:25.644888+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67715072 unmapped: 434176 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:26.645084+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67723264 unmapped: 425984 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:27.645201+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67723264 unmapped: 425984 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:28.645415+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67723264 unmapped: 425984 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:29.645607+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67731456 unmapped: 417792 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:30.645760+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67731456 unmapped: 417792 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:31.645928+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67739648 unmapped: 409600 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:32.646138+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67739648 unmapped: 409600 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:33.646270+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67739648 unmapped: 409600 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:34.646492+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67747840 unmapped: 401408 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:35.646737+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67747840 unmapped: 401408 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:36.646981+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 393216 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:37.647283+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 393216 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:38.647647+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67756032 unmapped: 393216 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:39.647940+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67764224 unmapped: 385024 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:40.648189+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67764224 unmapped: 385024 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:41.648399+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67772416 unmapped: 376832 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:42.648676+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67772416 unmapped: 376832 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:43.648912+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67780608 unmapped: 368640 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 16.57 MB, 0.03 MB/s
                                           Interval WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 2.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:44.649216+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:45.649450+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67870720 unmapped: 278528 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:46.649793+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67878912 unmapped: 270336 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:47.650054+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67878912 unmapped: 270336 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:48.650259+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67878912 unmapped: 270336 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:49.650430+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67887104 unmapped: 262144 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:50.650727+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67887104 unmapped: 262144 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:51.650884+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 253952 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:52.651059+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 253952 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:53.651271+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 253952 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:54.651468+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67903488 unmapped: 245760 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:55.651662+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67903488 unmapped: 245760 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:56.651942+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 237568 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:57.652185+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 237568 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:58.652376+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 237568 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:59.652652+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 229376 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:00.652914+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67919872 unmapped: 229376 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:01.653097+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67928064 unmapped: 221184 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:02.653278+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67928064 unmapped: 221184 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:03.653490+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67928064 unmapped: 221184 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:04.653739+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 212992 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:05.653926+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67936256 unmapped: 212992 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:06.654177+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 204800 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:07.654471+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 204800 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:08.655194+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67944448 unmapped: 204800 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:09.655463+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 196608 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:10.655769+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 196608 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:11.656059+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67952640 unmapped: 196608 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:12.656294+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 188416 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:13.656607+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67960832 unmapped: 188416 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:14.656846+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 180224 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:15.657081+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 180224 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:16.657346+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67969024 unmapped: 180224 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:17.657537+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 172032 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:18.657783+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67977216 unmapped: 172032 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:19.657990+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 163840 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:20.658196+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 163840 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:21.658347+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67985408 unmapped: 163840 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:22.658487+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:23.658663+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:24.658865+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 147456 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:25.659060+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68001792 unmapped: 147456 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:26.659264+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 139264 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:27.659409+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68009984 unmapped: 139264 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:28.659660+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 131072 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:29.659836+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 131072 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:30.660001+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68018176 unmapped: 131072 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:31.660196+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 122880 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:32.660332+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 122880 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:33.660487+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68026368 unmapped: 122880 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:34.660650+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 114688 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:35.660841+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 114688 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:36.660964+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68034560 unmapped: 114688 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:37.661104+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 106496 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:38.661315+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 106496 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:39.661476+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68042752 unmapped: 106496 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:40.661611+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 98304 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:41.661796+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68050944 unmapped: 98304 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:42.661957+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 90112 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:43.662089+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68059136 unmapped: 90112 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:44.662238+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 81920 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:45.662394+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 81920 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:46.662610+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68067328 unmapped: 81920 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:47.662745+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 73728 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:48.662927+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68075520 unmapped: 73728 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:49.663155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 65536 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:50.663336+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 65536 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:51.663617+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68083712 unmapped: 65536 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:52.663792+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68091904 unmapped: 57344 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:53.664030+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68091904 unmapped: 57344 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:54.664239+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 49152 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:55.664385+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 49152 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:56.664643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68100096 unmapped: 49152 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:57.664803+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 40960 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:58.665014+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68108288 unmapped: 40960 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:59.665170+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 32768 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:00.665373+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 32768 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:01.665514+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68116480 unmapped: 32768 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:02.665681+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 24576 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:03.665869+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 24576 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:04.666023+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68124672 unmapped: 24576 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:05.666184+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 16384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:06.666326+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 16384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:07.666477+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68132864 unmapped: 16384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:08.666665+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 8192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:09.666806+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68141056 unmapped: 8192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:10.666957+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 0 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:11.667107+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68149248 unmapped: 0 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:12.667274+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 1040384 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:13.667402+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 1040384 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:14.667603+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68157440 unmapped: 1040384 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:15.667754+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68165632 unmapped: 1032192 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:16.667910+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68165632 unmapped: 1032192 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:17.668070+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 1024000 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:18.668260+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 1024000 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:19.668419+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 1024000 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:20.668602+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 1015808 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:21.669400+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 1015808 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:22.669654+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68182016 unmapped: 1015808 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:23.669892+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 1007616 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:24.669985+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 1007616 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:25.670260+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 999424 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:26.670420+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 999424 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:27.670603+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 999424 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:28.670792+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 991232 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:29.670971+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68206592 unmapped: 991232 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:30.671112+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 983040 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:31.671283+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68214784 unmapped: 983040 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:32.671404+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 974848 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:33.671625+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68222976 unmapped: 974848 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:34.671781+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 958464 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:35.672101+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 958464 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:36.672280+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68239360 unmapped: 958464 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:37.672416+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 950272 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:38.672613+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 950272 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:39.672776+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68247552 unmapped: 950272 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:40.672917+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 942080 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:41.673116+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 942080 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:42.673309+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68255744 unmapped: 942080 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:43.673457+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:44.673603+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:45.673800+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:46.674015+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:47.674300+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:48.674431+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:49.674553+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:50.674681+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:51.674808+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:52.675056+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:53.675212+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:54.675369+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:55.675539+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:56.675695+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:57.676949+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:58.678221+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:59.678373+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:00.678706+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:01.678825+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:02.678979+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:03.679104+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:04.679254+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:05.679410+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:06.679662+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:07.679842+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:08.680007+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:09.680153+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:10.680358+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:11.681045+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:12.681182+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68263936 unmapped: 933888 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:13.681348+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:14.681488+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:15.681700+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:16.682238+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:17.682647+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:18.682908+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:19.683182+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:20.683357+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:21.683735+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:22.684077+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:23.684219+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:24.684348+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:25.684475+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:26.684633+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:27.684790+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:28.684955+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:29.685041+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:30.685212+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:31.685382+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:32.685562+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:33.685749+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:34.685901+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:35.686107+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:36.686261+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:37.686465+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:38.686644+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:39.686757+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:40.686879+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:41.686992+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:42.687154+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:43.687330+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:44.687486+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:45.687648+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:46.687796+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:47.687928+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:48.688178+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:49.688313+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:50.688424+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:51.688564+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68272128 unmapped: 925696 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:52.688753+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:53.688943+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:54.689133+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:55.689289+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:56.689447+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:57.689601+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:58.689722+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:59.689846+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:00.690000+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:01.690165+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:02.690294+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:03.690459+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:04.690678+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:05.690807+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:06.690958+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:07.691201+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:08.691351+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:09.691553+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:10.691757+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:11.691954+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:12.692128+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:13.692257+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:14.692395+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:15.692597+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:16.692812+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:17.693035+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:18.693257+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:19.693493+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:20.693695+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:21.693889+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:22.694086+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:23.694275+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:24.694507+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:25.694655+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:26.694882+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:27.695081+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68280320 unmapped: 917504 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:28.695334+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:29.695631+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:30.695805+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:31.695960+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:32.696112+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:33.696716+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:34.696909+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:35.697058+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:36.697229+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:37.697441+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:38.698202+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:39.698404+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:40.698613+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:41.698909+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:42.699174+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:43.699506+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:44.699838+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:45.700096+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:46.700357+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:47.700592+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:48.700806+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:49.701081+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:50.701323+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:51.701732+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:52.701975+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:53.702136+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:54.702305+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:55.702510+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:56.702669+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:57.702901+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:58.703149+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:59.703308+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:00.703518+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:01.703700+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:02.703872+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:03.704115+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:04.704330+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:05.704540+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:06.704855+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:07.705010+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:08.705338+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:09.705488+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:10.705677+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:11.705850+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:12.706118+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:13.706333+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68288512 unmapped: 909312 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:14.706502+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:15.706659+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:16.706879+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:17.707117+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:18.707458+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:19.707652+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:20.707814+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:21.707997+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:22.708215+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:23.708368+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:24.708523+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:25.708691+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:26.708810+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:27.708959+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:28.709158+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:29.709307+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:30.709516+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:31.709703+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:32.709854+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:33.710076+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:34.710284+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:35.710499+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:36.710739+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:37.710971+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:38.711207+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:39.711374+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:40.711512+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:41.711676+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:42.711854+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:43.712001+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:44.712144+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:45.712282+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:46.712486+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:47.712664+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:48.712874+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68296704 unmapped: 901120 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:49.713025+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: mgrc ms_handle_reset ms_handle_reset con 0x55cf1d7fe000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1553116858
Dec 03 21:43:48 compute-0 ceph-osd[87094]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1553116858,v1:192.168.122.100:6801/1553116858]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: get_auth_request con 0x55cf1e65f000 auth_method 0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: mgrc handle_mgr_configure stats_period=5
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68313088 unmapped: 884736 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:50.713212+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 ms_handle_reset con 0x55cf1ee92400 session 0x55cf1e2d7880
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 ms_handle_reset con 0x55cf1ee92800 session 0x55cf1e2416c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d878800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:51.713392+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:52.713630+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:53.713761+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:54.713945+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:55.714127+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:56.714279+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:57.714525+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:58.714773+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:59.714891+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:00.715057+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:01.715185+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:02.715351+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:03.715498+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:04.715623+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:05.715749+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:06.715974+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:07.716166+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:08.716625+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:09.716881+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:10.717239+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:11.717395+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:12.717550+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:13.717719+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:14.717879+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:15.718054+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:16.718168+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:17.718348+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:18.718605+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:19.718807+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:20.718976+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:21.719180+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:22.719474+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:23.719682+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:24.719841+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:25.719989+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:26.720166+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:27.720313+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:28.720522+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:29.720745+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:30.720906+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:31.721102+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:32.721292+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:33.721434+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:34.721657+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:35.721789+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:36.722055+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:37.722203+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:38.722401+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:39.722551+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:40.722836+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:41.722994+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:42.723132+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:43.723259+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:44.723642+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:45.723808+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:46.724000+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:47.724147+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:48.724338+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:49.724512+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:50.724678+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:51.725052+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:52.725254+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:53.725422+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:54.725595+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:55.725735+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:56.725834+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:57.726000+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:58.726185+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:59.726360+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:00.726494+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:01.726674+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:02.726812+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:03.726968+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:04.727101+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:05.727240+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:06.727394+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:07.727525+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:08.727705+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:09.727814+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:10.727910+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:11.728055+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:12.728187+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:13.728311+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:14.728451+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:15.728645+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:16.728789+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:17.728906+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:18.729046+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:19.729168+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:20.729400+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:21.729552+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:22.729713+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:23.729854+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:24.730061+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:25.760909+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:26.761028+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:27.761194+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:28.761404+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:29.761564+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:30.761738+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:31.761874+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:32.762080+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:33.762308+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:34.762537+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:35.762682+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:36.762911+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:37.763086+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:38.763276+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:39.763430+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:40.763649+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:41.763787+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:42.763926+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:43.764073+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:44.764239+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68575232 unmapped: 622592 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:45.764417+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:46.764643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:47.764800+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:48.765003+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:49.765176+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:50.765398+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:51.765565+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:52.765774+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:53.765930+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:54.766703+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:55.766916+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:56.767155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:57.767303+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:58.767520+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:59.767686+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:00.767876+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:01.768058+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:02.768289+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:03.768439+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:04.768635+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:05.768811+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:06.768988+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:07.769150+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:08.769441+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:09.769607+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:10.769771+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:11.769969+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:12.770135+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:13.770298+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:14.770472+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:15.770630+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:16.770761+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:17.770891+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:18.771103+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:19.771280+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:20.771502+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:21.771621+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:22.771729+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:23.771824+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:24.771917+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68517888 unmapped: 679936 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:25.772108+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:26.772272+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:27.772482+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:28.772657+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:29.772827+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:30.773023+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:31.773201+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:32.773357+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:33.773534+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:34.773711+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:35.773883+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:36.774043+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:37.774219+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:38.774380+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:39.774512+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:40.774725+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:41.774893+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:42.775086+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:43.775244+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:44.775391+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:45.775536+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:46.775811+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:47.775952+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:48.776196+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:49.776402+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:50.776620+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68526080 unmapped: 671744 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:51.776797+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:52.776940+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:53.777141+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:54.777394+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68534272 unmapped: 663552 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:55.777618+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:56.777869+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:57.778075+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:58.778265+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:59.778534+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:00.778726+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:01.778932+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:02.779078+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:03.779256+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:04.779480+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:05.779751+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:06.779990+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:07.780243+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:08.780482+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:09.780643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:10.780854+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:11.781035+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:12.781460+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:13.781637+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:14.781813+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:15.782004+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:16.782189+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:17.782383+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:18.782554+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:19.782764+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:20.782929+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:21.783149+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:22.783293+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:23.783421+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:24.783607+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:25.783761+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:26.784042+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:27.784248+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:28.784463+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:29.784621+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:30.784838+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:31.785065+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:32.785275+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:33.785415+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:34.785696+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:35.785965+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:36.786178+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:37.786328+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:38.786495+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:39.786665+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:40.786886+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:41.787152+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:42.787467+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:43.787636+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:44.787818+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:45.788010+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:46.788168+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:47.788315+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:48.788478+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:49.788639+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:50.788800+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:51.788978+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:52.789143+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:53.789307+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:54.789486+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:55.789646+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:56.789807+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:57.789970+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:58.790170+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:59.790351+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:00.790517+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:01.790690+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:02.790884+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:03.791036+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:04.791178+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:05.791374+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:06.791535+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:07.792185+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:08.792376+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:09.792667+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:10.792832+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:11.793023+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:12.793403+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:13.793650+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:14.793827+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:15.793979+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:16.794185+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:17.794466+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:18.794706+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:19.794883+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:20.795098+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:21.795251+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:22.795453+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:23.795643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:24.795905+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:25.796076+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:26.796269+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:27.796489+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:28.796668+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:29.796796+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:30.796959+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:31.797129+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:32.797272+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:33.797480+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:34.797679+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:35.797841+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:36.797992+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:37.798187+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:38.798389+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:39.798539+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:40.798790+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:41.799345+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:42.799620+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:43.799814+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 4515 writes, 20K keys, 4515 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4515 writes, 505 syncs, 8.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed5a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.001       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55cf1bed58d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:44.800000+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:45.800252+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:46.800458+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:47.800680+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:48.800943+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:49.801148+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:50.801278+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:51.801543+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:52.801776+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:53.801925+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:54.802107+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:55.803117+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:56.803882+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:57.804191+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:58.804507+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:59.805328+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:00.805950+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:01.806156+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:02.806507+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:03.806673+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:04.806973+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:05.807533+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:06.807802+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:07.808117+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:08.808371+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:09.808641+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:10.808886+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:11.809097+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:12.809286+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:13.809499+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:14.809694+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:15.809914+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:16.810082+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:17.810263+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:18.810474+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:19.810671+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:20.810867+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:21.811069+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:22.811259+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:23.811492+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:24.811654+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:25.811806+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:26.811920+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:27.812850+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:28.813253+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:29.815675+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:30.816894+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:31.817229+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:32.817915+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:33.818370+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:34.818734+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:35.819347+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:36.819672+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:37.819811+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:38.820114+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:39.820358+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:40.820532+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:41.820719+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:42.821032+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:43.821384+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:44.821666+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:45.821973+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:46.822131+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:47.822343+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:48.822626+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:49.822831+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:50.823025+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:51.823210+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:52.823351+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:53.823507+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:54.823661+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:55.823824+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:56.824004+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:57.824197+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:58.824418+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:59.824683+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:00.824928+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:01.825157+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:02.825345+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:03.825544+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:04.825877+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68542464 unmapped: 655360 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:05.826142+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:06.826284+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:07.826517+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:08.826747+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:09.827047+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:10.827280+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:11.827445+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:12.827805+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:13.828212+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:14.828389+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 557290 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 heartbeat osd_stat(store_statfs(0x4fe0ad000/0x0/0x4ffc00000, data 0xb8057/0x11f000, compress 0x0/0x0/0x0, omap 0xad29, meta 0x1a252d7), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:15.828737+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:16.829181+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:17.829401+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:18.829635+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68550656 unmapped: 647168 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 1025.044555664s of 1025.062866211s, submitted: 10
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:19.829854+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 562450 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68706304 unmapped: 491520 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 63 heartbeat osd_stat(store_statfs(0x4fe0a7000/0x0/0x4ffc00000, data 0xb962c/0x123000, compress 0x0/0x0/0x0, omap 0xafb1, meta 0x1a2504f), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 63 handle_osd_map epochs [64,64], i have 63, src has [1,64]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:20.830025+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 68747264 unmapped: 450560 heap: 69197824 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 65 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb84700
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:21.830208+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 69083136 unmapped: 16900096 heap: 85983232 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:22.830398+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 16883712 heap: 85983232 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 65 heartbeat osd_stat(store_statfs(0x4fd89b000/0x0/0x4ffc00000, data 0x8bc287/0x92b000, compress 0x0/0x0/0x0, omap 0xb4cd, meta 0x1a24b33), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:23.830604+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 69099520 unmapped: 16883712 heap: 85983232 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 65 heartbeat osd_stat(store_statfs(0x4fd89b000/0x0/0x4ffc00000, data 0x8bc287/0x92b000, compress 0x0/0x0/0x0, omap 0xb4cd, meta 0x1a24b33), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:24.830781+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 682603 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 66 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1fbcb6c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:25.830992+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:26.831190+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:27.831342+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:28.831648+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 66 heartbeat osd_stat(store_statfs(0x4fcc2a000/0x0/0x4ffc00000, data 0x152d8a3/0x15a0000, compress 0x0/0x0/0x0, omap 0xbaaf, meta 0x1a24551), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:29.831862+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 685815 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 66 heartbeat osd_stat(store_statfs(0x4fcc2a000/0x0/0x4ffc00000, data 0x152d8a3/0x15a0000, compress 0x0/0x0/0x0, omap 0xbaaf, meta 0x1a24551), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 66 heartbeat osd_stat(store_statfs(0x4fcc2a000/0x0/0x4ffc00000, data 0x152d8a3/0x15a0000, compress 0x0/0x0/0x0, omap 0xbaaf, meta 0x1a24551), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:30.832057+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 66 handle_osd_map epochs [66,67], i have 66, src has [1,67]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.130791664s of 11.549943924s, submitted: 54
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:31.832218+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:32.832411+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:33.832632+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:34.832813+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:35.832950+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:36.833147+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:37.833357+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:38.833628+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:39.833775+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:40.833921+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:41.834080+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:42.834168+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:43.834342+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:44.834486+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:45.834658+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:46.834781+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:47.835031+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:48.835229+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:49.835346+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:50.835710+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:51.835938+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:52.836126+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:53.836353+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:54.836636+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:55.836877+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:56.837144+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:57.837335+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:58.837554+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:59.837811+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 688139 data_alloc: 218103808 data_used: 658
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:00.837978+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 heartbeat osd_stat(store_statfs(0x4fcc27000/0x0/0x4ffc00000, data 0x152ed53/0x15a3000, compress 0x0/0x0/0x0, omap 0xbcdd, meta 0x1a24323), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:01.838298+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 23904256 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 31.202196121s of 31.212696075s, submitted: 13
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:02.838454+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 67 handle_osd_map epochs [67,68], i have 67, src has [1,68]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 68 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1f791a40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 23699456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:03.838664+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 23699456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 68 heartbeat osd_stat(store_statfs(0x4fcc23000/0x0/0x4ffc00000, data 0x1530330/0x15a7000, compress 0x0/0x0/0x0, omap 0xc2b9, meta 0x1a23d47), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:04.838813+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 68 heartbeat osd_stat(store_statfs(0x4fcc23000/0x0/0x4ffc00000, data 0x1530330/0x15a7000, compress 0x0/0x0/0x0, omap 0xc2b9, meta 0x1a23d47), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 694334 data_alloc: 218103808 data_used: 677
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 23699456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:05.838934+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 22200320 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 68 heartbeat osd_stat(store_statfs(0x4fc424000/0x0/0x4ffc00000, data 0x1d30353/0x1da8000, compress 0x0/0x0/0x0, omap 0xc2b9, meta 0x1a23d47), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:06.839663+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 72425472 unmapped: 21954560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:07.839830+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 68 handle_osd_map epochs [69,69], i have 68, src has [1,69]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 69 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1fbe4e00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 72441856 unmapped: 21938176 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1e773c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:08.839967+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 69 heartbeat osd_stat(store_statfs(0x4f9c1f000/0x0/0x4ffc00000, data 0x4531923/0x45ab000, compress 0x0/0x0/0x0, omap 0xc53b, meta 0x1a23ac5), peers [0,2] op hist [1])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 69 handle_osd_map epochs [70,70], i have 69, src has [1,70]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 70 ms_handle_reset con 0x55cf1e773c00 session 0x55cf1e2b2000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 20766720 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:09.840081+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e2408c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 713122 data_alloc: 218103808 data_used: 677
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1e704e00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 20643840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:10.840283+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1e704fc0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadb400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 71 ms_handle_reset con 0x55cf1fadb400 session 0x55cf1da9a8c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73916416 unmapped: 20463616 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:11.840451+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74047488 unmapped: 20332544 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:12.840656+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 71 handle_osd_map epochs [72,72], i have 71, src has [1,72]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.184928894s of 10.620976448s, submitted: 112
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 72 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1f7916c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 20480000 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 72 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1dc25c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 72 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1dc24700
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:13.840823+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 72 heartbeat osd_stat(store_statfs(0x4fcc15000/0x0/0x4ffc00000, data 0x1535f4b/0x15b5000, compress 0x0/0x0/0x0, omap 0xd0d5, meta 0x1a22f2b), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadb400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 73900032 unmapped: 20480000 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdaa800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:14.841021+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 72 handle_osd_map epochs [73,73], i have 72, src has [1,73]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 73 ms_handle_reset con 0x55cf1fdaa800 session 0x55cf1e3c1880
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 73 ms_handle_reset con 0x55cf1fadb400 session 0x55cf1d5af180
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 723385 data_alloc: 218103808 data_used: 677
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74153984 unmapped: 20226048 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:15.841178+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 73 heartbeat osd_stat(store_statfs(0x4fcc13000/0x0/0x4ffc00000, data 0x1537959/0x15b7000, compress 0x0/0x0/0x0, omap 0xd866, meta 0x1a2279a), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 74 handle_osd_map epochs [74,75], i have 74, src has [1,75]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 75 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1fb44fc0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 75 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1ee96a80
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 75 heartbeat osd_stat(store_statfs(0x4fcc13000/0x0/0x4ffc00000, data 0x1537959/0x15b7000, compress 0x0/0x0/0x0, omap 0xd866, meta 0x1a2279a), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74637312 unmapped: 19742720 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:16.841378+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 19693568 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:17.841644+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 19693568 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:18.841915+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74686464 unmapped: 19693568 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:19.842163+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 75 heartbeat osd_stat(store_statfs(0x4fcc11000/0x0/0x4ffc00000, data 0x1539c56/0x15bb000, compress 0x0/0x0/0x0, omap 0xdf34, meta 0x1a220cc), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 725390 data_alloc: 218103808 data_used: 677
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74752000 unmapped: 19628032 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:20.842309+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 75 handle_osd_map epochs [76,76], i have 75, src has [1,76]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 76 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1e2d6700
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74760192 unmapped: 19619840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 76 heartbeat osd_stat(store_statfs(0x4fba70000/0x0/0x4ffc00000, data 0x1539c79/0x15bc000, compress 0x0/0x0/0x0, omap 0xdf34, meta 0x2bc20cc), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:21.842461+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 76 heartbeat osd_stat(store_statfs(0x4fba6b000/0x0/0x4ffc00000, data 0x153b262/0x15bf000, compress 0x0/0x0/0x0, omap 0xe1c1, meta 0x2bc1e3f), peers [0,2] op hist [0,0,0,0,0,0,1])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74776576 unmapped: 19603456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:22.842639+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 76 handle_osd_map epochs [76,77], i have 76, src has [1,77]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.756548882s of 10.034585953s, submitted: 104
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 77 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb85500
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74776576 unmapped: 19603456 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd4c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:23.842715+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 78 ms_handle_reset con 0x55cf1fdd4c00 session 0x55cf1fb44540
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74809344 unmapped: 19570688 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd4800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:24.842889+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 78 handle_osd_map epochs [79,79], i have 78, src has [1,79]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 79 ms_handle_reset con 0x55cf1fdd4800 session 0x55cf1e2b3a40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 747169 data_alloc: 218103808 data_used: 677
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74817536 unmapped: 19562496 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:25.843012+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 79 handle_osd_map epochs [80,80], i have 79, src has [1,80]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 80 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1d865880
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 74817536 unmapped: 19562496 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 80 handle_osd_map epochs [81,81], i have 80, src has [1,81]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 81 heartbeat osd_stat(store_statfs(0x4fba60000/0x0/0x4ffc00000, data 0x153fa30/0x15cc000, compress 0x0/0x0/0x0, omap 0xeb96, meta 0x2bc146a), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:26.843186+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 81 handle_osd_map epochs [82,82], i have 81, src has [1,82]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 82 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1ee97a40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 82 heartbeat osd_stat(store_statfs(0x4fba54000/0x0/0x4ffc00000, data 0x15424fc/0x15d4000, compress 0x0/0x0/0x0, omap 0xf4d9, meta 0x2bc0b27), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 18595840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:27.843372+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 18595840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:28.843562+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 82 heartbeat osd_stat(store_statfs(0x4fba4f000/0x0/0x4ffc00000, data 0x1543ac9/0x15d7000, compress 0x0/0x0/0x0, omap 0xf87b, meta 0x2bc0785), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75784192 unmapped: 18595840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:29.843745+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 761402 data_alloc: 218103808 data_used: 677
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75694080 unmapped: 18685952 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:30.843912+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 83 heartbeat osd_stat(store_statfs(0x4fba55000/0x0/0x4ffc00000, data 0x1543ac9/0x15d7000, compress 0x0/0x0/0x0, omap 0xfaa1, meta 0x2bc055f), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 83 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1e75f6c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75833344 unmapped: 18546688 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:31.844069+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 83 handle_osd_map epochs [84,84], i have 83, src has [1,84]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 84 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1e2b3a40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 75997184 unmapped: 18382848 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1e773000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:32.844225+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 9.886656761s of 10.113715172s, submitted: 149
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 85 ms_handle_reset con 0x55cf1e773000 session 0x55cf1fbcac40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76029952 unmapped: 18350080 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 85 heartbeat osd_stat(store_statfs(0x4fba4f000/0x0/0x4ffc00000, data 0x15466d8/0x15db000, compress 0x0/0x0/0x0, omap 0x105a5, meta 0x2bbfa5b), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:33.844406+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 86 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1dc24c40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76111872 unmapped: 18268160 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:34.844555+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 770245 data_alloc: 218103808 data_used: 12860
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 87 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1f791a40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 18145280 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:35.845187+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 88 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1ee96e00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fba49000/0x0/0x4ffc00000, data 0x154a328/0x15e0000, compress 0x0/0x0/0x0, omap 0x10d34, meta 0x2bbf2cc), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 18096128 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:36.845327+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 18096128 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:37.845548+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 88 heartbeat osd_stat(store_statfs(0x4fba44000/0x0/0x4ffc00000, data 0x154b926/0x15e2000, compress 0x0/0x0/0x0, omap 0x10fef, meta 0x2bbf011), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 18079744 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:38.845886+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 18079744 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:39.846105+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 772529 data_alloc: 218103808 data_used: 12860
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 18079744 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:40.846317+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:41.846490+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 89 heartbeat osd_stat(store_statfs(0x4fba45000/0x0/0x4ffc00000, data 0x154cdf2/0x15e5000, compress 0x0/0x0/0x0, omap 0x1120b, meta 0x2bbedf5), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:42.846663+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:43.846856+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:44.847049+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 774677 data_alloc: 218103808 data_used: 20982
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 18194432 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:45.847231+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 12.887969017s of 13.055091858s, submitted: 117
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 90 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1ee97340
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:46.847425+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:47.847631+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 90 heartbeat osd_stat(store_statfs(0x4fba41000/0x0/0x4ffc00000, data 0x154e41a/0x15e9000, compress 0x0/0x0/0x0, omap 0x11621, meta 0x2bbe9df), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:48.847808+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:49.847989+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 779189 data_alloc: 218103808 data_used: 20982
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 18202624 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:50.848147+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd4c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 90 ms_handle_reset con 0x55cf1fdd4c00 session 0x55cf1fb85dc0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76324864 unmapped: 18055168 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:51.848258+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 92 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1fb841c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 92 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e2b2000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 92 ms_handle_reset con 0x55cf1fdd5000 session 0x55cf1fbb0fc0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:52.848418+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 92 heartbeat osd_stat(store_statfs(0x4fba37000/0x0/0x4ffc00000, data 0x155105d/0x15f1000, compress 0x0/0x0/0x0, omap 0x11ad7, meta 0x2bbe529), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:53.848644+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:54.848846+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 790116 data_alloc: 218103808 data_used: 21017
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:55.849024+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:56.849199+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:57.849325+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 92 heartbeat osd_stat(store_statfs(0x4fba37000/0x0/0x4ffc00000, data 0x155105d/0x15f1000, compress 0x0/0x0/0x0, omap 0x11ad7, meta 0x2bbe529), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:58.849517+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:59.849677+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 790116 data_alloc: 218103808 data_used: 21017
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:00.849862+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76365824 unmapped: 18014208 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 15.561586380s of 15.613102913s, submitted: 36
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:01.850069+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 17956864 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:02.850256+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 93 heartbeat osd_stat(store_statfs(0x4fba36000/0x0/0x4ffc00000, data 0x155250d/0x15f4000, compress 0x0/0x0/0x0, omap 0x11d8c, meta 0x2bbe274), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 17956864 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:03.850453+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 93 handle_osd_map epochs [95,95], i have 93, src has [1,95]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 93 handle_osd_map epochs [94,95], i have 93, src has [1,95]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 95 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1e2d61c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76480512 unmapped: 17899520 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:04.850623+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf2001c000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 800233 data_alloc: 218103808 data_used: 21017
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76496896 unmapped: 17883136 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:05.850801+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 95 heartbeat osd_stat(store_statfs(0x4fba2e000/0x0/0x4ffc00000, data 0x15550fb/0x15fa000, compress 0x0/0x0/0x0, omap 0x121e2, meta 0x2bbde1e), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76496896 unmapped: 17883136 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:06.850978+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 76496896 unmapped: 17883136 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:07.851130+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf2001a800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 96 ms_handle_reset con 0x55cf2001a800 session 0x55cf1ee96700
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 16654336 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:08.851823+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 97 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e2408c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 16572416 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:09.852346+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 97 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1ee97dc0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 810508 data_alloc: 218103808 data_used: 21083
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 16572416 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 97 ms_handle_reset con 0x55cf1fdd5000 session 0x55cf1fbca1c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:10.852772+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 98 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1fb45880
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 98 heartbeat osd_stat(store_statfs(0x4fba26000/0x0/0x4ffc00000, data 0x1557cfb/0x1602000, compress 0x0/0x0/0x0, omap 0x12956, meta 0x2bbd6aa), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77725696 unmapped: 16654336 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:11.853116+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf2001ac00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.186281204s of 10.288720131s, submitted: 73
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 99 ms_handle_reset con 0x55cf2001ac00 session 0x55cf1d5ae1c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 16564224 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:12.853386+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb85880
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 16547840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:13.853643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 16547840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:14.853774+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 heartbeat osd_stat(store_statfs(0x4fba1c000/0x0/0x4ffc00000, data 0x155bf8b/0x160a000, compress 0x0/0x0/0x0, omap 0x13542, meta 0x2bbcabe), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 818980 data_alloc: 218103808 data_used: 21083
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 16547840 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:15.854919+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77856768 unmapped: 16523264 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:16.855101+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fb24c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fb24c00 session 0x55cf1fb856c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdd5800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fdd5800 session 0x55cf1e2d7500
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fdd5000 session 0x55cf1fb84380
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1f7a6800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1f7a6800 session 0x55cf1fb85180
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1fb84000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1fb84540
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1f7fd400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1f7fd400 session 0x55cf1ee976c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 16506880 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:17.855231+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1e75e540
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 16506880 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:18.855437+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1e273000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1e273000 session 0x55cf1fb5a000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1fb45180
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1fb9a1c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:19.855648+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1f7fd400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 823271 data_alloc: 218103808 data_used: 21083
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 heartbeat osd_stat(store_statfs(0x4fb9fc000/0x0/0x4ffc00000, data 0x157ffdd/0x1630000, compress 0x0/0x0/0x0, omap 0x13542, meta 0x2bbcabe), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:20.855793+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:21.856075+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 16203776 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:22.856444+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 101 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1dc24380
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.026945114s of 11.134009361s, submitted: 77
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1faa4000 session 0x55cf1f848380
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdaa000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1e3c0c40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1fdaa000 session 0x55cf1dc256c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78766080 unmapped: 15613952 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 102 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1d5aea80
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:23.856624+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 102 handle_osd_map epochs [102,103], i have 102, src has [1,103]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 103 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1fbca700
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 15605760 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:24.856791+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 103 heartbeat osd_stat(store_statfs(0x4fb9ed000/0x0/0x4ffc00000, data 0x15840c3/0x163a000, compress 0x0/0x0/0x0, omap 0x13c4f, meta 0x2bbc3b1), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 835465 data_alloc: 218103808 data_used: 23197
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1ee97c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 15605760 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:25.856940+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1faa4000 session 0x55cf1e75fdc0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1faa4000 session 0x55cf1f790c40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1da9b880
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78774272 unmapped: 15605760 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:26.857212+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1eea5000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 heartbeat osd_stat(store_statfs(0x4fb9e8000/0x0/0x4ffc00000, data 0x15856ac/0x163d000, compress 0x0/0x0/0x0, omap 0x13f40, meta 0x2bbc0c0), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 ms_handle_reset con 0x55cf1eea5000 session 0x55cf1e2d7340
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78790656 unmapped: 15589376 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:27.857356+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1faa4400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 104 handle_osd_map epochs [104,105], i have 105, src has [1,105]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 15556608 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 105 ms_handle_reset con 0x55cf1faa4400 session 0x55cf1da9b180
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:28.857549+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 15556608 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:29.857777+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 105 ms_handle_reset con 0x55cf1eea4000 session 0x55cf1e705500
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 105 ms_handle_reset con 0x55cf1f7fd400 session 0x55cf1d5af6c0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 839964 data_alloc: 218103808 data_used: 24430
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78708736 unmapped: 15671296 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 106 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1fb44540
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:30.857966+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78716928 unmapped: 15663104 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:31.858126+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 106 heartbeat osd_stat(store_statfs(0x4fba0e000/0x0/0x4ffc00000, data 0x15642c4/0x161c000, compress 0x0/0x0/0x0, omap 0x146e3, meta 0x2bbb91d), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78716928 unmapped: 15663104 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:32.858300+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 107 heartbeat osd_stat(store_statfs(0x4fba0b000/0x0/0x4ffc00000, data 0x1565790/0x161f000, compress 0x0/0x0/0x0, omap 0x149b4, meta 0x2bbb64c), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78716928 unmapped: 15663104 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:33.858468+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 10.730767250s of 10.873538971s, submitted: 112
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf1fdd5400 session 0x55cf1e2b2540
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf2001c000 session 0x55cf1ee96c40
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadfc00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf1fadfc00 session 0x55cf1fbb1500
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:34.858685+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1d879c00
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 842323 data_alloc: 218103808 data_used: 22894
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 107 ms_handle_reset con 0x55cf1d879c00 session 0x55cf1d5afdc0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdaa400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:35.858837+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 108 ms_handle_reset con 0x55cf1fdaa400 session 0x55cf1d5ae700
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:36.858999+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:37.859166+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 108 heartbeat osd_stat(store_statfs(0x4fba0a000/0x0/0x4ffc00000, data 0x1566d94/0x1620000, compress 0x0/0x0/0x0, omap 0x1509e, meta 0x2bbaf62), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:38.859357+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:39.859531+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 843615 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:40.859742+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 108 heartbeat osd_stat(store_statfs(0x4fba0a000/0x0/0x4ffc00000, data 0x1566d94/0x1620000, compress 0x0/0x0/0x0, omap 0x1509e, meta 0x2bbaf62), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:41.859940+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:42.860171+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:43.860369+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:44.860560+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 846389 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:45.860770+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 109 heartbeat osd_stat(store_statfs(0x4fba07000/0x0/0x4ffc00000, data 0x1568260/0x1623000, compress 0x0/0x0/0x0, omap 0x152b5, meta 0x2bbad4b), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 109 handle_osd_map epochs [109,110], i have 110, src has [1,110]
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore(/var/lib/ceph/osd/ceph-1) _kv_sync_thread utilization: idle 11.690909386s of 12.290042877s, submitted: 67
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:46.860992+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:47.861173+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:48.861407+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:49.861692+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:50.861859+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:51.862010+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:52.862195+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:53.862327+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:54.862495+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:55.862622+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _renew_subs
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:56.862816+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:57.862922+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:58.863129+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:59.863290+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:00.863475+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:01.863664+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:02.863863+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:03.864069+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:04.864265+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:05.864440+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:06.864666+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:07.864831+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:08.865030+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:09.865255+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:10.865424+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:11.865587+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:12.865741+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:13.866776+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:14.866918+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:15.867105+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:16.867297+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:17.867461+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:18.867752+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:19.867991+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:20.868167+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:21.868333+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:22.868531+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:23.868698+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:24.868848+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:25.869043+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:26.869377+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:27.869550+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:28.869754+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78618624 unmapped: 15761408 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:29.869938+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:30.870114+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:31.870284+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:32.870501+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:33.870647+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:34.870802+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:35.870969+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:36.871101+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:37.871241+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:38.871418+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:39.871621+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:40.871830+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:41.872065+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1164: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:42.872309+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:43.872494+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:44.872666+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:45.872849+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:46.872998+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:47.873151+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:48.873337+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:49.873468+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread fragmentation_score=0.000123 took=0.000012s
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:50.873629+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78569472 unmapped: 15810560 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:51.873750+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:52.873913+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:53.874076+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:54.874255+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:55.874416+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:56.874614+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:57.874759+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:58.874987+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:59.875143+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:00.875281+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:01.875403+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:02.875546+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:03.875635+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:04.875792+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:05.875949+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:06.878408+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:07.878534+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78561280 unmapped: 15818752 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:08.878718+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78635008 unmapped: 15745024 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:09.878852+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config show' '{prefix=config show}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 15548416 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:10.878996+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 15343616 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:11.879141+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 15343616 heap: 94380032 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:12.879260+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'perf dump' '{prefix=perf dump}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'perf schema' '{prefix=perf schema}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:13.879387+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:14.879552+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:15.879822+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:16.879942+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:17.880055+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:18.880187+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:19.880365+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:20.880544+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:21.880702+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:22.880815+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:23.880940+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:24.881072+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:25.881197+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:26.881320+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:27.881447+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:28.881608+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:29.881719+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:30.881844+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:31.881975+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:32.882103+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:33.882270+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:34.882401+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:35.882615+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:36.882737+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:37.882849+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:38.883050+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:39.883176+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:40.883351+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:41.883490+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:42.883630+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:43.883838+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:44.883987+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:45.884147+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:46.884328+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:47.884537+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:48.884841+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:49.885170+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:50.885451+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:51.885734+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:52.885976+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:53.886255+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:54.886491+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:55.886614+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:56.886754+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:57.886919+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:58.887196+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:59.887660+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:00.887975+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:01.888267+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:02.888512+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:03.888749+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:04.889006+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:05.889297+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:06.889454+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:07.889656+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:08.889842+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:09.889981+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:10.890114+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:11.890257+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:12.890432+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:13.890613+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:14.890793+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:15.891022+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:16.891192+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:17.891328+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:18.891556+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:19.891762+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:20.891890+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79110144 unmapped: 26312704 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:21.892154+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:22.893481+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:23.893878+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:24.894638+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:25.895159+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:26.895662+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:27.895932+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:28.896174+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:29.896617+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:30.896917+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:31.897162+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:32.897530+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:33.897855+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:34.898053+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:35.898374+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:36.898672+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78979072 unmapped: 26443776 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:37.898888+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:38.899239+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:39.899622+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:40.899911+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:41.900180+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:42.900485+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:43.900754+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:44.900932+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:45.901165+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:46.901350+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:47.901558+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:48.901896+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:49.902137+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:50.902336+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:51.902505+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:52.902661+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:53.902890+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:54.903321+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:55.903668+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:56.904266+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:57.904610+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78987264 unmapped: 26435584 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:58.904913+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:59.905081+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:00.905345+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:01.905647+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:02.905802+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:03.906039+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:04.906288+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:05.906459+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:06.906696+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:07.906959+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:08.907354+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:09.907657+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:10.907835+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:11.907996+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:12.908138+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:13.908301+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78995456 unmapped: 26427392 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:14.908518+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:15.908674+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:16.908790+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:17.908923+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:18.909137+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:19.909281+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:20.909475+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:21.909750+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:22.909922+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:23.910102+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:24.910273+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:25.910500+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:26.910712+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:27.910880+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:28.911160+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:29.911250+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79003648 unmapped: 26419200 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:30.911435+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:31.911672+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:32.911819+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:33.911974+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:34.912169+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:35.912396+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:36.912655+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:37.912819+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:38.913008+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:39.913140+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:40.913350+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:41.913654+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:42.913868+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:43.914023+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:44.914210+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:45.914453+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:46.914643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:47.914778+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:48.914967+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:49.915109+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:50.915342+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:51.915504+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:52.915677+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:53.915794+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:54.915963+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:55.916156+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:56.916322+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:57.916484+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:58.916711+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:59.916863+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:00.917037+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79020032 unmapped: 26402816 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:01.917262+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:02.917465+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:03.917639+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:04.917838+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:05.918079+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:06.918287+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:07.918446+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:08.918646+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:09.918811+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:10.918941+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:11.919103+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:12.919248+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:13.919449+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:14.919741+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79028224 unmapped: 26394624 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:15.919937+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:16.920105+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:17.920230+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:18.920398+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:19.920532+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:20.920681+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:21.920896+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:22.921039+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:23.921222+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:24.921365+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:25.921537+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:26.921684+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:27.921910+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:28.922177+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:29.922342+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:30.925171+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:31.926419+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:32.926636+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:33.926949+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:34.927174+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:35.927680+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:36.928422+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79036416 unmapped: 26386432 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:37.929084+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:38.929362+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:39.929627+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:40.929792+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:41.929954+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:42.930130+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:43.930330+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:44.930495+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:45.930632+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:46.930789+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:47.930996+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:48.931183+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:49.931411+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:50.931624+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:51.931771+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:52.931945+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:53.932129+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:54.932287+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:55.932560+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:56.932830+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:57.932996+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:58.933249+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:59.933561+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:00.933871+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:01.934076+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:02.934226+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:03.934388+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:04.934536+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:05.934677+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:06.934851+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:07.935059+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:08.935329+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:09.935545+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:10.935877+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:11.936150+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:12.936390+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:13.936613+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:14.936822+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:15.936992+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:16.937142+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:17.937362+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:18.937643+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:19.937854+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:20.938092+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:21.938300+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:22.938562+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:23.938795+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:24.938960+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:25.939154+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:26.939354+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:27.939517+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:28.939744+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:29.939894+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:30.940033+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:31.940175+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:32.940363+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:33.940633+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:34.940876+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:35.941015+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:36.941274+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:37.941466+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:38.941656+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:39.941829+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:40.941978+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:41.942159+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:42.942303+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:43.942485+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 6189 writes, 25K keys, 6189 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 6189 writes, 1252 syncs, 4.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1674 writes, 4716 keys, 1674 commit groups, 1.0 writes per commit group, ingest: 2.71 MB, 0.00 MB/s
                                           Interval WAL: 1674 writes, 747 syncs, 2.24 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:44.942706+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:45.942868+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:46.943038+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:47.943202+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:48.943422+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 ms_handle_reset con 0x55cf1d0b9c00 session 0x55cf1befa000
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fdaa800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:49.943718+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:50.943919+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 ms_handle_reset con 0x55cf1d879000 session 0x55cf1e2b2a80
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf1fadb400
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 ms_handle_reset con 0x55cf1d878800 session 0x55cf1befb180
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: handle_auth_request added challenge on 0x55cf2001a800
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:51.944138+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:52.944360+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:53.944558+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:54.944751+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:55.944969+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:56.945166+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:57.945332+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:58.945543+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79044608 unmapped: 26378240 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:59.945698+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:00.945885+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:01.946075+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:02.946312+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:03.946602+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:04.946852+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:05.947067+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:06.947265+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:07.947443+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:08.947685+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:09.947831+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:10.947972+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:11.948168+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:12.948323+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:13.948545+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:14.948777+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:15.948942+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:16.949155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:17.949297+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79052800 unmapped: 26370048 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:18.949483+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:19.949629+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:20.949772+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:21.949919+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:22.950109+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:23.950289+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:24.950449+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:25.950631+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:26.950844+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:27.951025+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:28.951238+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:29.951383+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:30.951632+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:31.951855+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:32.952055+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:33.952188+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:34.952365+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:35.952523+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:36.952679+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:37.952830+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:38.953043+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:39.953180+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:40.954225+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:41.954641+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:42.955094+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:43.955362+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:44.955613+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:45.955771+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:46.956138+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:47.956308+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:48.956662+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:49.956897+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:50.957115+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:51.957427+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:52.957648+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:53.957880+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:54.958040+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79060992 unmapped: 26361856 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:55.958237+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:56.958451+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:57.958657+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:58.958819+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:59.958968+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:00.959120+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:01.959370+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:02.959644+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:03.959851+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:04.960026+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79069184 unmapped: 26353664 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:05.960192+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:06.960335+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:07.960499+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:08.960709+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:09.960854+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:10.961023+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:11.961174+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:12.961298+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:13.961438+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:14.961631+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79077376 unmapped: 26345472 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:15.961808+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 26337280 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:16.962011+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 26337280 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:17.962167+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 26337280 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:18.962313+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 26337280 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:19.962470+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79085568 unmapped: 26337280 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:20.962628+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:21.962818+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:22.962985+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:23.963289+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:24.963445+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:25.963685+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:26.963811+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:27.963970+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:28.964159+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:29.964321+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:30.964444+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:31.964636+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:32.964787+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:33.964917+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:34.965059+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:35.965254+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:36.965480+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:37.965676+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:38.965875+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:39.966089+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79093760 unmapped: 26329088 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:40.966237+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 26320896 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:41.966385+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 26320896 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:42.966512+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 26320896 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:43.966639+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79101952 unmapped: 26320896 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:44.966755+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:45.966893+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:46.967101+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:47.967293+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:48.967731+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:49.967967+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:50.968151+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:51.968359+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:52.968521+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:53.968678+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:54.968767+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:55.968915+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:56.969072+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:57.969212+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:58.969430+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:59.969556+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:00.969729+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:01.969899+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:02.970061+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:03.970319+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:04.970524+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78954496 unmapped: 26468352 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:05.970664+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:06.970914+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:07.971179+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:08.971388+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:09.971624+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:10.971835+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:11.971994+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:12.972205+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:13.972366+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:14.972605+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:15.973881+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:16.975551+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:17.976426+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:18.977816+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:19.978026+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:20.978815+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:21.979418+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:22.979837+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:23.980506+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:24.981305+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:25.981862+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:26.982111+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:27.982560+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:28.983343+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:29.983796+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:30.984100+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:31.984529+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:32.984890+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:33.985228+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:34.985468+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:35.985698+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:36.985902+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:37.986197+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:38.986465+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:39.986679+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:40.986871+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:41.987116+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:42.987369+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:43.987618+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78823424 unmapped: 26599424 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:44.987943+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:45.988155+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:46.988282+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:47.988435+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:48.988627+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:49.988782+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:50.988939+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:51.989228+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:52.989408+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:53.989651+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:54.990644+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:55.991147+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:56.991596+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:57.991795+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:58.992620+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:59.993059+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:00.993638+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:01.994113+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:02.994270+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:03.994434+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:04.994648+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:05.994876+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:06.995076+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:07.995420+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:08.995819+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:09.996107+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:10.996432+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:11.996834+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:12.997231+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:13.997429+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:14.997704+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:15.997994+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:16.998174+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:17.998425+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:18.998685+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:19.999005+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:20.999347+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:21.999516+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:23.000149+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:24.000742+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:25.001257+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:26.001719+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:27.001867+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:28.002205+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:29.002668+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:30.002928+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:31.003177+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:32.003809+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:33.004289+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:34.004641+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:35.005165+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:36.005486+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:37.005691+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:38.005944+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:39.006209+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:40.006433+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:41.006644+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:42.006794+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:43.007014+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:44.007272+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:45.007514+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:46.007724+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:47.008041+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:48.008235+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:49.008516+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:50.008752+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:51.008982+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:52.009129+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:53.009284+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:54.009446+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:55.009683+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:56.009890+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:57.010076+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:58.010247+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:59.010621+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:00.010797+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:01.010948+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:02.011120+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:03.011308+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:04.011506+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:05.011618+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:06.011753+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:07.011876+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:08.012036+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:09.012371+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:10.012538+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:11.012723+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:12.012892+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:13.013038+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:48 compute-0 ceph-osd[87094]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:48 compute-0 ceph-osd[87094]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 849163 data_alloc: 218103808 data_used: 22793
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:14.013249+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:15.013389+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78831616 unmapped: 26591232 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: osd.1 110 heartbeat osd_stat(store_statfs(0x4fba04000/0x0/0x4ffc00000, data 0x1569710/0x1626000, compress 0x0/0x0/0x0, omap 0x154ce, meta 0x2bbab32), peers [0,2] op hist [])
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config show' '{prefix=config show}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:16.013548+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 78921728 unmapped: 26501120 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:17.013734+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: tick
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_tickets
Dec 03 21:43:48 compute-0 ceph-osd[87094]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:18.013869+0000)
Dec 03 21:43:48 compute-0 ceph-osd[87094]: prioritycache tune_memory target: 4294967296 mapped: 79011840 unmapped: 26411008 heap: 105422848 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:48 compute-0 ceph-osd[87094]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:43:48 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 03 21:43:48 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/840509899' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 03 21:43:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:43:48.951 151937 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 03 21:43:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:43:48.952 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 03 21:43:48 compute-0 ovn_metadata_agent[151932]: 2025-12-03 21:43:48.952 151937 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 03 21:43:48 compute-0 rsyslogd[1006]: imjournal from <np0005544708:ceph-osd>: begin to drop messages due to rate-limiting
Dec 03 21:43:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 03 21:43:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1134410792' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: from='client.15100 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2129293945' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: from='client.15104 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/840509899' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1134410792' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 03 21:43:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/629291703' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 03 21:43:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1276346780' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 03 21:43:49 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 03 21:43:49 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2356564469' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 03 21:43:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/470178597' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mon[75204]: pgmap v1164: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:50 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/629291703' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1276346780' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2356564469' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/470178597' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #54. Immutable memtables: 0.
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.173633) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 54
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798230173702, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1413, "num_deletes": 506, "total_data_size": 1151660, "memory_usage": 1177040, "flush_reason": "Manual Compaction"}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #55: started
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798230184333, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 55, "file_size": 1047974, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22668, "largest_seqno": 24080, "table_properties": {"data_size": 1041829, "index_size": 2836, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 17183, "raw_average_key_size": 19, "raw_value_size": 1027156, "raw_average_value_size": 1163, "num_data_blocks": 127, "num_entries": 883, "num_filter_entries": 883, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764798111, "oldest_key_time": 1764798111, "file_creation_time": 1764798230, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 55, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 10733 microseconds, and 5883 cpu microseconds.
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.184378) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #55: 1047974 bytes OK
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.184396) [db/memtable_list.cc:519] [default] Level-0 commit table #55 started
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.185641) [db/memtable_list.cc:722] [default] Level-0 commit table #55: memtable #1 done
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.185662) EVENT_LOG_v1 {"time_micros": 1764798230185656, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.185682) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1144119, prev total WAL file size 1144119, number of live WAL files 2.
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000051.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.186407) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353032' seq:72057594037927935, type:22 .. '6C6F676D00373533' seq:0, type:0; will stop at (end)
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [55(1023KB)], [53(6769KB)]
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798230186469, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [55], "files_L6": [53], "score": -1, "input_data_size": 7980082, "oldest_snapshot_seqno": -1}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 03 21:43:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1430012524' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #56: 4316 keys, 5844006 bytes, temperature: kUnknown
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798230252061, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 56, "file_size": 5844006, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 5814237, "index_size": 17883, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10821, "raw_key_size": 105717, "raw_average_key_size": 24, "raw_value_size": 5735770, "raw_average_value_size": 1328, "num_data_blocks": 753, "num_entries": 4316, "num_filter_entries": 4316, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1764796079, "oldest_key_time": 0, "file_creation_time": 1764798230, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d83d641b-0db7-44b5-9540-349f4c36f664", "db_session_id": "YRQHTOJ9E78VAMDNI6U1", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.252282) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 5844006 bytes
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.253448) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.6 rd, 89.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 6.6 +0.0 blob) out(5.6 +0.0 blob), read-write-amplify(13.2) write-amplify(5.6) OK, records in: 5333, records dropped: 1017 output_compression: NoCompression
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.253466) EVENT_LOG_v1 {"time_micros": 1764798230253457, "job": 28, "event": "compaction_finished", "compaction_time_micros": 65649, "compaction_time_cpu_micros": 26876, "output_level": 6, "num_output_files": 1, "total_output_size": 5844006, "num_input_records": 5333, "num_output_records": 4316, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000055.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798230253794, "job": 28, "event": "table_file_deletion", "file_number": 55}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-0/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: EVENT_LOG_v1 {"time_micros": 1764798230255175, "job": 28, "event": "table_file_deletion", "file_number": 53}
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.186302) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.255261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.255267) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.255269) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.255270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:43:50 compute-0 ceph-mon[75204]: rocksdb: (Original Log Time 2025/12/03-21:43:50.255272) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 03 21:43:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 03 21:43:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3887995334' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 03 21:43:50 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1165: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:50 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 03 21:43:50 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/4084817237' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 03 21:43:51 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3694439618' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 03 21:43:51 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/689140745' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1430012524' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3887995334' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/4084817237' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3694439618' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/689140745' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 03 21:43:51 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3860950218' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 03 21:43:51 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1838690655' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:43:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:43:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:43:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:43:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:43:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] scanning for idle connections..
Dec 03 21:43:51 compute-0 ceph-mgr[75500]: [volumes INFO mgr_util] cleaning up connections: []
Dec 03 21:43:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 03 21:43:52 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2061338019' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 03 21:43:52 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 03 21:43:52 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/504157317' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 03 21:43:52 compute-0 ceph-mon[75204]: pgmap v1165: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:52 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3860950218' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Dec 03 21:43:52 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1838690655' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Dec 03 21:43:52 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2061338019' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Dec 03 21:43:52 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/504157317' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Dec 03 21:43:52 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1166: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:52 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15136 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:52 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15138 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:53 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15140 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:53 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15142 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:12.020892+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 65 sent 63 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:41.276897+0000 osd.0 (osd.0) 64 : cluster [DBG] 7.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:41.287254+0000 osd.0 (osd.0) 65 : cluster [DBG] 7.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 65)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:41.276897+0000 osd.0 (osd.0) 64 : cluster [DBG] 7.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:41.287254+0000 osd.0 (osd.0) 65 : cluster [DBG] 7.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:13.021166+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:14.021429+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 794624 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:15.021638+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 455105 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:16.021832+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 786432 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:17.022052+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 67 sent 65 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:46.158534+0000 osd.0 (osd.0) 66 : cluster [DBG] 2.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:46.169083+0000 osd.0 (osd.0) 67 : cluster [DBG] 2.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 67)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:46.158534+0000 osd.0 (osd.0) 66 : cluster [DBG] 2.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:46.169083+0000 osd.0 (osd.0) 67 : cluster [DBG] 2.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 778240 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:18.022310+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:19.022521+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 770048 heap: 66052096 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:20.022717+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 459927 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65257472 unmapped: 1843200 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:21.022867+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 69 sent 67 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:50.216276+0000 osd.0 (osd.0) 68 : cluster [DBG] 5.5 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:50.226827+0000 osd.0 (osd.0) 69 : cluster [DBG] 5.5 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 69)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:50.216276+0000 osd.0 (osd.0) 68 : cluster [DBG] 5.5 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:50.226827+0000 osd.0 (osd.0) 69 : cluster [DBG] 5.5 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65265664 unmapped: 1835008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:22.023141+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.936560631s of 10.947173119s, submitted: 6
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 1826816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:23.023319+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 71 sent 69 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:52.223376+0000 osd.0 (osd.0) 70 : cluster [DBG] 7.6 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:52.233798+0000 osd.0 (osd.0) 71 : cluster [DBG] 7.6 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 71)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:52.223376+0000 osd.0 (osd.0) 70 : cluster [DBG] 7.6 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:52.233798+0000 osd.0 (osd.0) 71 : cluster [DBG] 7.6 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 1826816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:24.023636+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65273856 unmapped: 1826816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:25.023818+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 462338 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 1818624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:26.023943+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65282048 unmapped: 1818624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:27.024110+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65290240 unmapped: 1810432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:28.024250+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65298432 unmapped: 1802240 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:29.024458+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 73 sent 71 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:58.273350+0000 osd.0 (osd.0) 72 : cluster [DBG] 2.2 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:11:58.284111+0000 osd.0 (osd.0) 73 : cluster [DBG] 2.2 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 73)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:58.273350+0000 osd.0 (osd.0) 72 : cluster [DBG] 2.2 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:11:58.284111+0000 osd.0 (osd.0) 73 : cluster [DBG] 2.2 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 1794048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:30.024632+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 464749 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 1794048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:31.024827+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65306624 unmapped: 1794048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:32.025040+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.101898193s of 10.109881401s, submitted: 4
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65323008 unmapped: 1777664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:33.025242+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 75 sent 73 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:02.333955+0000 osd.0 (osd.0) 74 : cluster [DBG] 5.3 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:02.344481+0000 osd.0 (osd.0) 75 : cluster [DBG] 5.3 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 75)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:02.333955+0000 osd.0 (osd.0) 74 : cluster [DBG] 5.3 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:02.344481+0000 osd.0 (osd.0) 75 : cluster [DBG] 5.3 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65323008 unmapped: 1777664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:34.025438+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65323008 unmapped: 1777664 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:35.025622+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 467160 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65331200 unmapped: 1769472 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:36.025756+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65331200 unmapped: 1769472 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:37.026099+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65339392 unmapped: 1761280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:38.026252+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65339392 unmapped: 1761280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:39.026425+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65339392 unmapped: 1761280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:40.026654+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 77 sent 75 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:09.353971+0000 osd.0 (osd.0) 76 : cluster [DBG] 7.9 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:09.364531+0000 osd.0 (osd.0) 77 : cluster [DBG] 7.9 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 77)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:09.353971+0000 osd.0 (osd.0) 76 : cluster [DBG] 7.9 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:09.364531+0000 osd.0 (osd.0) 77 : cluster [DBG] 7.9 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 469571 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 1753088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:41.026843+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65347584 unmapped: 1753088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:42.027061+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65396736 unmapped: 1703936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:43.027254+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 79 sent 77 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:12.302360+0000 osd.0 (osd.0) 78 : cluster [DBG] 5.4 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:12.312892+0000 osd.0 (osd.0) 79 : cluster [DBG] 5.4 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 79)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:12.302360+0000 osd.0 (osd.0) 78 : cluster [DBG] 5.4 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:12.312892+0000 osd.0 (osd.0) 79 : cluster [DBG] 5.4 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65396736 unmapped: 1703936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:44.027505+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.910529137s of 11.991785049s, submitted: 6
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65404928 unmapped: 1695744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:45.027667+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 81 sent 79 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:14.325859+0000 osd.0 (osd.0) 80 : cluster [DBG] 7.18 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:14.336396+0000 osd.0 (osd.0) 81 : cluster [DBG] 7.18 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 81)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:14.325859+0000 osd.0 (osd.0) 80 : cluster [DBG] 7.18 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:14.336396+0000 osd.0 (osd.0) 81 : cluster [DBG] 7.18 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 474395 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65404928 unmapped: 1695744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:46.027840+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65413120 unmapped: 1687552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:47.028040+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.c scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.c scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65421312 unmapped: 1679360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:48.028181+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 83 sent 81 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:17.335608+0000 osd.0 (osd.0) 82 : cluster [DBG] 3.c scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:17.346098+0000 osd.0 (osd.0) 83 : cluster [DBG] 3.c scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 83)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:17.335608+0000 osd.0 (osd.0) 82 : cluster [DBG] 3.c scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:17.346098+0000 osd.0 (osd.0) 83 : cluster [DBG] 3.c scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65421312 unmapped: 1679360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:49.028401+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65421312 unmapped: 1679360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:50.028556+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 476806 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65429504 unmapped: 1671168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:51.028752+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65437696 unmapped: 1662976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:52.028893+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65445888 unmapped: 1654784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:53.029004+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65462272 unmapped: 1638400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:54.029195+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 85 sent 83 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:23.406202+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.7 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:23.416740+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.7 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 85)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:23.406202+0000 osd.0 (osd.0) 84 : cluster [DBG] 5.7 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:23.416740+0000 osd.0 (osd.0) 85 : cluster [DBG] 5.7 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65462272 unmapped: 1638400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:55.029538+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.031082153s of 11.042010307s, submitted: 6
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 481628 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65478656 unmapped: 1622016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:56.029892+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 87 sent 85 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:25.367891+0000 osd.0 (osd.0) 86 : cluster [DBG] 7.4 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:25.378494+0000 osd.0 (osd.0) 87 : cluster [DBG] 7.4 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 87)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:25.367891+0000 osd.0 (osd.0) 86 : cluster [DBG] 7.4 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:25.378494+0000 osd.0 (osd.0) 87 : cluster [DBG] 7.4 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65478656 unmapped: 1622016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:57.030451+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65478656 unmapped: 1622016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:58.030758+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65495040 unmapped: 1605632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:11:59.031052+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65503232 unmapped: 1597440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:00.031280+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 89 sent 87 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:29.329769+0000 osd.0 (osd.0) 88 : cluster [DBG] 5.2 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:29.340340+0000 osd.0 (osd.0) 89 : cluster [DBG] 5.2 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 89)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:29.329769+0000 osd.0 (osd.0) 88 : cluster [DBG] 5.2 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:29.340340+0000 osd.0 (osd.0) 89 : cluster [DBG] 5.2 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 486452 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65511424 unmapped: 1589248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:01.031683+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 91 sent 89 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:30.364905+0000 osd.0 (osd.0) 90 : cluster [DBG] 7.1f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:30.375515+0000 osd.0 (osd.0) 91 : cluster [DBG] 7.1f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 91)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:30.364905+0000 osd.0 (osd.0) 90 : cluster [DBG] 7.1f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:30.375515+0000 osd.0 (osd.0) 91 : cluster [DBG] 7.1f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65519616 unmapped: 1581056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:02.032142+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 93 sent 91 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:31.387245+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:31.397761+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 93)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:31.387245+0000 osd.0 (osd.0) 92 : cluster [DBG] 3.f scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:31.397761+0000 osd.0 (osd.0) 93 : cluster [DBG] 3.f scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65519616 unmapped: 1581056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:03.032360+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65527808 unmapped: 1572864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:04.032692+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 95 sent 93 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:33.406595+0000 osd.0 (osd.0) 94 : cluster [DBG] 3.1 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:33.417117+0000 osd.0 (osd.0) 95 : cluster [DBG] 3.1 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 95)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:33.406595+0000 osd.0 (osd.0) 94 : cluster [DBG] 3.1 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:33.417117+0000 osd.0 (osd.0) 95 : cluster [DBG] 3.1 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65536000 unmapped: 1564672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:05.032996+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 491274 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65536000 unmapped: 1564672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:06.033226+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.963118553s of 10.986808777s, submitted: 10
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65536000 unmapped: 1564672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:07.033394+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 97 sent 95 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:36.354698+0000 osd.0 (osd.0) 96 : cluster [DBG] 2.18 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:36.365210+0000 osd.0 (osd.0) 97 : cluster [DBG] 2.18 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 97)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:36.354698+0000 osd.0 (osd.0) 96 : cluster [DBG] 2.18 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:36.365210+0000 osd.0 (osd.0) 97 : cluster [DBG] 2.18 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65552384 unmapped: 1548288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:08.033671+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 99 sent 97 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:37.404891+0000 osd.0 (osd.0) 98 : cluster [DBG] 5.1e scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:37.415457+0000 osd.0 (osd.0) 99 : cluster [DBG] 5.1e scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 99)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:37.404891+0000 osd.0 (osd.0) 98 : cluster [DBG] 5.1e scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:37.415457+0000 osd.0 (osd.0) 99 : cluster [DBG] 5.1e scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65568768 unmapped: 1531904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:09.033959+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 101 sent 99 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:38.387164+0000 osd.0 (osd.0) 100 : cluster [DBG] 2.19 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:38.397699+0000 osd.0 (osd.0) 101 : cluster [DBG] 2.19 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 101)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:38.387164+0000 osd.0 (osd.0) 100 : cluster [DBG] 2.19 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:38.397699+0000 osd.0 (osd.0) 101 : cluster [DBG] 2.19 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65585152 unmapped: 1515520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:10.034217+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 103 sent 101 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:39.416523+0000 osd.0 (osd.0) 102 : cluster [DBG] 6.3 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:39.434113+0000 osd.0 (osd.0) 103 : cluster [DBG] 6.3 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 103)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:39.416523+0000 osd.0 (osd.0) 102 : cluster [DBG] 6.3 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:39.434113+0000 osd.0 (osd.0) 103 : cluster [DBG] 6.3 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 500924 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65593344 unmapped: 1507328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:11.034471+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65601536 unmapped: 1499136 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:12.034730+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 105 sent 103 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:41.416689+0000 osd.0 (osd.0) 104 : cluster [DBG] 6.0 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:41.441323+0000 osd.0 (osd.0) 105 : cluster [DBG] 6.0 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 105)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:41.416689+0000 osd.0 (osd.0) 104 : cluster [DBG] 6.0 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:41.441323+0000 osd.0 (osd.0) 105 : cluster [DBG] 6.0 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65617920 unmapped: 1482752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:13.035000+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 107 sent 105 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:42.436100+0000 osd.0 (osd.0) 106 : cluster [DBG] 6.7 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:42.450325+0000 osd.0 (osd.0) 107 : cluster [DBG] 6.7 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 107)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:42.436100+0000 osd.0 (osd.0) 106 : cluster [DBG] 6.7 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:42.450325+0000 osd.0 (osd.0) 107 : cluster [DBG] 6.7 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65617920 unmapped: 1482752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:14.035245+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65626112 unmapped: 1474560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:15.035383+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 109 sent 107 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:44.510801+0000 osd.0 (osd.0) 108 : cluster [DBG] 6.9 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:44.521337+0000 osd.0 (osd.0) 109 : cluster [DBG] 6.9 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 109)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:44.510801+0000 osd.0 (osd.0) 108 : cluster [DBG] 6.9 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:44.521337+0000 osd.0 (osd.0) 109 : cluster [DBG] 6.9 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 508157 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65626112 unmapped: 1474560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:16.035638+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65626112 unmapped: 1474560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:17.035785+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.056098938s of 11.086086273s, submitted: 14
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65634304 unmapped: 1466368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:18.035963+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 111 sent 109 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:47.440877+0000 osd.0 (osd.0) 110 : cluster [DBG] 6.5 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:47.458547+0000 osd.0 (osd.0) 111 : cluster [DBG] 6.5 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 111)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:47.440877+0000 osd.0 (osd.0) 110 : cluster [DBG] 6.5 scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:47.458547+0000 osd.0 (osd.0) 111 : cluster [DBG] 6.5 scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65642496 unmapped: 1458176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:19.036205+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.a scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_channel(cluster) log [DBG] : 6.a scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 1441792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:20.036367+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  log_queue is 2 last_log 113 sent 111 num 2 unsent 2 sending 2
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:49.357916+0000 osd.0 (osd.0) 112 : cluster [DBG] 6.a scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  will send 2025-12-03T21:12:49.368546+0000 osd.0 (osd.0) 113 : cluster [DBG] 6.a scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client handle_log_ack log(last 113)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:49.357916+0000 osd.0 (osd.0) 112 : cluster [DBG] 6.a scrub starts
Dec 03 21:43:53 compute-0 ceph-osd[86059]: log_client  logged 2025-12-03T21:12:49.368546+0000 osd.0 (osd.0) 113 : cluster [DBG] 6.a scrub ok
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 1441792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:21.036599+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65658880 unmapped: 1441792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:22.036796+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65667072 unmapped: 1433600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:23.036998+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65667072 unmapped: 1433600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:24.037287+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:25.037554+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65675264 unmapped: 1425408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:26.037897+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65675264 unmapped: 1425408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:27.038106+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65683456 unmapped: 1417216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:28.038646+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65683456 unmapped: 1417216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:29.038895+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65683456 unmapped: 1417216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:30.039150+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 1400832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:31.039519+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 1400832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:32.039826+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65699840 unmapped: 1400832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:33.040044+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65716224 unmapped: 1384448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:34.040320+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65716224 unmapped: 1384448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:35.040752+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 1376256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:36.041133+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 1376256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:37.041398+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65724416 unmapped: 1376256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:38.041633+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65732608 unmapped: 1368064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:39.041956+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65732608 unmapped: 1368064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:40.042255+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 1351680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:41.042700+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65748992 unmapped: 1351680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:42.042992+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:43.043223+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:44.043744+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65757184 unmapped: 1343488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:45.044134+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65773568 unmapped: 1327104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:46.044336+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65773568 unmapped: 1327104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:47.044624+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65773568 unmapped: 1327104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:48.044831+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65781760 unmapped: 1318912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:49.045340+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65789952 unmapped: 1310720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:50.045749+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65798144 unmapped: 1302528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:51.046036+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65798144 unmapped: 1302528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:52.046268+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65806336 unmapped: 1294336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:53.046510+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65814528 unmapped: 1286144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:54.046749+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65822720 unmapped: 1277952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:55.046923+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:56.047077+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:57.047269+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65830912 unmapped: 1269760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:58.047443+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:12:59.047627+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:00.074811+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65839104 unmapped: 1261568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:01.074993+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65847296 unmapped: 1253376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:02.075137+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65855488 unmapped: 1245184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:03.075332+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:04.075916+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:05.076101+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65863680 unmapped: 1236992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:06.076327+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 1228800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:07.076628+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65871872 unmapped: 1228800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:08.076799+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65880064 unmapped: 1220608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:09.076972+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65880064 unmapped: 1220608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:10.077102+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65880064 unmapped: 1220608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:11.077285+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65888256 unmapped: 1212416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:12.077487+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65888256 unmapped: 1212416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:13.077691+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65896448 unmapped: 1204224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:14.077965+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65904640 unmapped: 1196032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:15.078189+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65904640 unmapped: 1196032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:16.078461+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:17.078743+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:18.078943+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65912832 unmapped: 1187840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:19.079163+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65921024 unmapped: 1179648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:20.079339+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65921024 unmapped: 1179648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:21.079522+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65937408 unmapped: 1163264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:22.079734+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65937408 unmapped: 1163264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:23.079970+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:24.080305+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:25.080519+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65945600 unmapped: 1155072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:26.080710+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:27.080946+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:28.081188+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65953792 unmapped: 1146880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:29.081503+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65961984 unmapped: 1138688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:30.081649+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65961984 unmapped: 1138688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:31.081848+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65970176 unmapped: 1130496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:32.082059+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65970176 unmapped: 1130496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:33.082248+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65970176 unmapped: 1130496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:34.082522+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65978368 unmapped: 1122304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:35.082774+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65978368 unmapped: 1122304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:36.082945+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65986560 unmapped: 1114112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:37.083217+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65986560 unmapped: 1114112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:38.083438+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65986560 unmapped: 1114112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:39.083636+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65994752 unmapped: 1105920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:40.083919+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 65994752 unmapped: 1105920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:41.084161+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:42.084470+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:43.084786+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66011136 unmapped: 1089536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:44.086013+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:45.086154+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:46.086324+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66019328 unmapped: 1081344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:47.086528+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:48.086726+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:49.086936+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66043904 unmapped: 1056768 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:50.087190+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66052096 unmapped: 1048576 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:51.087769+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66060288 unmapped: 1040384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:52.087972+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66060288 unmapped: 1040384 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:53.088188+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66068480 unmapped: 1032192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:54.088614+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66068480 unmapped: 1032192 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:55.089054+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66076672 unmapped: 1024000 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:56.089388+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66076672 unmapped: 1024000 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:57.089656+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:58.089843+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:13:59.090075+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66084864 unmapped: 1015808 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:00.090302+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66093056 unmapped: 1007616 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:01.090652+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66109440 unmapped: 991232 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:02.090870+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:03.091037+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:04.091510+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66117632 unmapped: 983040 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:05.091617+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:06.091736+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:07.091983+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66125824 unmapped: 974848 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:08.092139+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:09.092412+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66134016 unmapped: 966656 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:10.092737+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66142208 unmapped: 958464 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:11.093042+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66142208 unmapped: 958464 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:12.093284+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 950272 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:13.093460+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 950272 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:14.093788+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66150400 unmapped: 950272 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:15.094007+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:16.094294+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:17.094717+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66158592 unmapped: 942080 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:18.095007+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66166784 unmapped: 933888 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:19.095212+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66166784 unmapped: 933888 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:20.095477+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66174976 unmapped: 925696 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:21.095657+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:22.095879+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66191360 unmapped: 909312 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:23.096163+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66199552 unmapped: 901120 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:24.096670+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66199552 unmapped: 901120 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:25.096846+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:26.097033+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:27.097217+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66207744 unmapped: 892928 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:28.097363+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:29.097644+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66215936 unmapped: 884736 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:30.097811+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:31.097972+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:32.098134+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66224128 unmapped: 876544 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:33.098272+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66232320 unmapped: 868352 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:34.098594+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66232320 unmapped: 868352 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:35.098770+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:36.098911+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:37.099102+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66240512 unmapped: 860160 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:38.099237+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 851968 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:39.099386+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66248704 unmapped: 851968 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:40.099632+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:41.099770+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:42.099973+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:43.100147+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66273280 unmapped: 827392 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:44.100461+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66281472 unmapped: 819200 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:45.100625+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66281472 unmapped: 819200 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:46.100797+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:47.101024+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:48.101255+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66289664 unmapped: 811008 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:49.101396+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:50.101618+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66297856 unmapped: 802816 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:51.101772+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66306048 unmapped: 794624 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:52.102533+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:53.102635+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66314240 unmapped: 786432 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:54.102843+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66322432 unmapped: 778240 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:55.103040+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66322432 unmapped: 778240 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:56.103248+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:57.103509+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:58.103734+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66330624 unmapped: 770048 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:14:59.103909+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66338816 unmapped: 761856 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:00.104067+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66355200 unmapped: 745472 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:01.104247+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66363392 unmapped: 737280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:02.104501+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66363392 unmapped: 737280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:03.104656+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66363392 unmapped: 737280 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:04.104815+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66371584 unmapped: 729088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:05.104955+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66371584 unmapped: 729088 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:06.105097+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66379776 unmapped: 720896 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:07.105243+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66379776 unmapped: 720896 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:08.105375+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66379776 unmapped: 720896 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:09.105501+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:10.105633+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:11.105777+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66387968 unmapped: 712704 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:12.105900+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:13.106044+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:14.106214+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66396160 unmapped: 704512 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:15.106415+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66404352 unmapped: 696320 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:16.106620+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66404352 unmapped: 696320 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:17.106741+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:18.106901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:19.107037+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66412544 unmapped: 688128 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:20.107236+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:21.107375+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66420736 unmapped: 679936 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:22.107650+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:23.107903+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:24.108112+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66428928 unmapped: 671744 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:25.108263+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:26.108419+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66437120 unmapped: 663552 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:27.108614+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66445312 unmapped: 655360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:28.108790+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66445312 unmapped: 655360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:29.108973+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66445312 unmapped: 655360 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:30.109150+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:31.109318+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:32.109445+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66453504 unmapped: 647168 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:33.109702+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:34.109963+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66461696 unmapped: 638976 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:35.110163+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66469888 unmapped: 630784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:36.110363+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66469888 unmapped: 630784 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:37.110684+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:38.110861+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:39.110995+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:40.111155+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66478080 unmapped: 622592 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:41.111339+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66486272 unmapped: 614400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:42.111643+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66486272 unmapped: 614400 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:43.111911+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66494464 unmapped: 606208 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:44.112164+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66494464 unmapped: 606208 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:45.112333+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66494464 unmapped: 606208 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:46.112688+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:47.112970+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:48.113208+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66502656 unmapped: 598016 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:49.113422+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66510848 unmapped: 589824 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:50.113753+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66510848 unmapped: 589824 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:51.113999+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:52.114229+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:53.114533+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66519040 unmapped: 581632 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:54.114759+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66527232 unmapped: 573440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:55.114966+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66527232 unmapped: 573440 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:56.115208+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66535424 unmapped: 565248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:57.115402+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66535424 unmapped: 565248 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:58.115531+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:15:59.115715+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:00.115901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66543616 unmapped: 557056 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:01.116108+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:02.116282+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:03.116449+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66551808 unmapped: 548864 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:04.116668+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66560000 unmapped: 540672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:05.116820+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66560000 unmapped: 540672 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:06.117085+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:07.117276+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:08.117489+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66568192 unmapped: 532480 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:09.117681+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66576384 unmapped: 524288 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:10.117861+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66584576 unmapped: 516096 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:11.118047+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66592768 unmapped: 507904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:12.118179+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66592768 unmapped: 507904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:13.118325+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66592768 unmapped: 507904 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:14.118503+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66600960 unmapped: 499712 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:15.118704+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66600960 unmapped: 499712 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:16.118890+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66600960 unmapped: 499712 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:17.119041+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:18.119178+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:19.119325+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:20.119472+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66609152 unmapped: 491520 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:21.119708+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:22.119871+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:23.120108+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66617344 unmapped: 483328 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:24.120341+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66625536 unmapped: 475136 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:25.120629+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66625536 unmapped: 475136 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:26.120887+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:27.121207+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:28.121476+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66633728 unmapped: 466944 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:29.121651+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:30.121837+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:31.122010+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66641920 unmapped: 458752 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:32.122167+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:33.122298+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:34.122547+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66650112 unmapped: 450560 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:35.122670+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:36.122816+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66658304 unmapped: 442368 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:37.122984+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:38.123121+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66666496 unmapped: 434176 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:39.123341+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:40.123599+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:41.123863+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66674688 unmapped: 425984 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:42.124082+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:43.124363+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:44.124652+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66682880 unmapped: 417792 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:45.124875+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:46.125000+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:47.125182+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66691072 unmapped: 409600 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:48.125396+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:49.125604+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66699264 unmapped: 401408 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:50.125758+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:51.125959+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66707456 unmapped: 393216 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:52.126083+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:53.126158+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:54.126302+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66715648 unmapped: 385024 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:55.126468+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:56.126638+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:57.126810+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66723840 unmapped: 376832 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:58.126997+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15144 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:16:59.127161+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66732032 unmapped: 368640 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:00.127345+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:01.127465+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66740224 unmapped: 360448 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:02.127629+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:03.127768+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:04.127939+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66748416 unmapped: 352256 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:05.128152+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:06.128387+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:07.128719+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66756608 unmapped: 344064 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:08.128901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66764800 unmapped: 335872 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:09.129091+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66764800 unmapped: 335872 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:10.129251+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66772992 unmapped: 327680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:11.129403+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66772992 unmapped: 327680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:12.129558+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66772992 unmapped: 327680 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:13.130078+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:14.130222+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66781184 unmapped: 319488 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:15.130375+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:16.130521+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:17.130618+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66789376 unmapped: 311296 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:18.130733+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:19.130857+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66797568 unmapped: 303104 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:20.131005+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:21.131144+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:22.131265+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66805760 unmapped: 294912 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:23.131417+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:24.131633+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:25.131812+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66813952 unmapped: 286720 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:26.131978+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:27.132097+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:28.132249+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66822144 unmapped: 278528 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:29.132426+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:30.132608+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66830336 unmapped: 270336 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:31.132767+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:32.132968+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:33.133130+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66838528 unmapped: 262144 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:34.133520+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:35.133672+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:36.133813+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66846720 unmapped: 253952 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:37.134041+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66854912 unmapped: 245760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:38.134204+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66854912 unmapped: 245760 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:39.134352+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 237568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:40.134512+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 237568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:41.134675+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66863104 unmapped: 237568 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:42.134836+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:43.135034+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66871296 unmapped: 229376 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:44.135218+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:45.135451+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66879488 unmapped: 221184 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:46.135669+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:47.135823+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:48.136000+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66887680 unmapped: 212992 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:49.136205+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:50.136534+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:51.136703+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66895872 unmapped: 204800 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:52.136910+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66904064 unmapped: 196608 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:53.137054+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:54.137376+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:55.137614+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66912256 unmapped: 188416 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:56.137803+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:57.138045+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66920448 unmapped: 180224 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:58.138327+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:17:59.138502+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:00.138608+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:01.138750+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:02.138937+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66928640 unmapped: 172032 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:03.139267+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66936832 unmapped: 163840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:04.139415+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66936832 unmapped: 163840 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:05.139554+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:06.140751+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66945024 unmapped: 155648 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:07.140938+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:08.141226+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66953216 unmapped: 147456 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:09.141479+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:10.141686+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:11.141901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66961408 unmapped: 139264 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:12.142102+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:13.142277+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:14.142512+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66969600 unmapped: 131072 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:15.142711+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:16.142878+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66977792 unmapped: 122880 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:17.143067+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:18.143230+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:19.143421+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66985984 unmapped: 114688 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:20.143598+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:21.143745+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:22.143885+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 66994176 unmapped: 106496 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:23.144035+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:24.144212+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67002368 unmapped: 98304 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:25.144330+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67010560 unmapped: 90112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:26.144493+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67010560 unmapped: 90112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:27.144675+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67010560 unmapped: 90112 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:28.144818+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67018752 unmapped: 81920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:29.144957+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67018752 unmapped: 81920 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:30.145093+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67026944 unmapped: 73728 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:31.145234+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67026944 unmapped: 73728 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:32.145387+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67026944 unmapped: 73728 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:33.145547+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:34.145742+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:35.145876+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:36.146033+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67043328 unmapped: 57344 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:37.146209+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:38.146368+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:39.146503+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67035136 unmapped: 65536 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 16.41 MB, 0.03 MB/s
                                           Interval WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:40.146654+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:41.146816+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:42.146998+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67100672 unmapped: 0 heap: 67100672 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:43.147135+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:44.147312+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67108864 unmapped: 1040384 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:45.147502+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:46.147668+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:47.147816+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67117056 unmapped: 1032192 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:48.147980+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67125248 unmapped: 1024000 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:49.148156+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:50.148382+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67133440 unmapped: 1015808 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:51.148614+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:52.148801+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67141632 unmapped: 1007616 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:53.148957+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:54.149307+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:55.149552+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67149824 unmapped: 999424 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:56.149804+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:57.149975+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:58.150229+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67158016 unmapped: 991232 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:18:59.150448+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:00.150647+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67166208 unmapped: 983040 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:01.150802+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:02.150939+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:03.151097+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67174400 unmapped: 974848 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:04.151324+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:05.151457+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67182592 unmapped: 966656 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:06.151661+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:07.151809+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:08.152059+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67190784 unmapped: 958464 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:09.152332+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:10.152533+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67198976 unmapped: 950272 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:11.152703+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:12.152902+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:13.153122+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67207168 unmapped: 942080 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:14.153285+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:15.153427+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67215360 unmapped: 933888 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:16.153619+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67223552 unmapped: 925696 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:17.153754+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:18.153943+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:19.154122+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67231744 unmapped: 917504 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:20.154276+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:21.154441+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67239936 unmapped: 909312 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:22.154593+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:23.154740+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:24.154958+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67248128 unmapped: 901120 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:25.155115+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:26.155309+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67256320 unmapped: 892928 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:27.155442+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:28.155534+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67264512 unmapped: 884736 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:29.155639+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:30.155781+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:31.155902+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67272704 unmapped: 876544 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:32.156041+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:33.156182+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:34.156334+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:35.156472+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67280896 unmapped: 868352 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:36.156680+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67289088 unmapped: 860160 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:37.156761+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:38.156877+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:39.157110+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67297280 unmapped: 851968 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:40.157284+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:41.157468+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:42.157609+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67305472 unmapped: 843776 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:43.157790+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:44.158059+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67313664 unmapped: 835584 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:45.158487+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:46.158697+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:47.158906+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67321856 unmapped: 827392 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:48.159106+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:49.159286+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67330048 unmapped: 819200 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:50.159422+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:51.159554+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67338240 unmapped: 811008 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:52.159846+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67346432 unmapped: 802816 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:53.160034+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:54.160279+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67354624 unmapped: 794624 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:55.160437+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:56.160653+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:57.160805+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67362816 unmapped: 786432 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:58.160947+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:19:59.161126+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:00.161288+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67371008 unmapped: 778240 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:01.161462+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:02.161630+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:03.161806+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67379200 unmapped: 770048 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:04.162026+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:05.162212+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67387392 unmapped: 761856 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:06.162387+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:07.162616+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:08.162786+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67395584 unmapped: 753664 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:09.162926+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:10.163075+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67403776 unmapped: 745472 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:11.163207+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:12.163419+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:13.163605+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:14.163802+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:15.163936+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:16.164135+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67411968 unmapped: 737280 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:17.164317+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:18.164499+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67420160 unmapped: 729088 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:19.164693+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:20.164869+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:21.165071+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67428352 unmapped: 720896 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:22.165298+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:23.165480+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67436544 unmapped: 712704 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:24.165678+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:25.165822+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:26.165967+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67444736 unmapped: 704512 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:27.166139+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:28.166370+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67452928 unmapped: 696320 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:29.166504+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:30.166681+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:31.166807+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67461120 unmapped: 688128 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:32.166960+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:33.167252+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67469312 unmapped: 679936 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:34.167456+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:35.167639+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67477504 unmapped: 671744 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:36.167848+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:37.167994+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:38.168111+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67485696 unmapped: 663552 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:39.168280+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:40.168432+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:41.168640+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:42.168812+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:43.184841+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:44.185011+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:45.185157+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:46.185351+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:47.185638+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:48.185858+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:49.186075+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:50.186399+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:51.186534+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:52.186679+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:53.186830+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:54.187040+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:55.187180+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:56.187389+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:57.187526+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:58.187752+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:20:59.187998+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:00.188126+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:01.188861+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:02.189002+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:03.189146+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:04.189325+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:05.189473+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:06.189631+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:07.189769+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:08.189887+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:09.190193+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:10.190321+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:11.190465+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:12.190625+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:13.190808+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:14.191005+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:15.191339+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:16.191521+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:17.191657+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:18.191841+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:19.192074+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:20.192255+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:21.192461+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:22.192638+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:23.192821+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:24.192956+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:25.193164+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:26.193374+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:27.193639+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:28.193804+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:29.193960+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:30.194130+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:31.194330+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:32.194512+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:33.194673+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:34.194933+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:35.195089+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67493888 unmapped: 655360 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:36.195232+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:37.195375+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:38.195610+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:39.195790+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:40.195990+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:41.196164+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:42.196380+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:43.196647+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:44.196859+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:45.197013+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:46.197189+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:47.197359+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:48.197507+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:49.197753+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:50.197918+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:51.198066+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:52.198208+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:53.198373+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:54.198788+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:55.198941+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:56.199081+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:57.199233+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:58.199380+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:21:59.199526+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:00.199719+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:01.199901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:02.200058+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:03.200235+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:04.200413+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:05.200636+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:06.200776+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:07.200929+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:08.201066+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:09.201219+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:10.201350+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:11.201494+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:12.201657+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:13.201798+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:14.202002+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:15.202163+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:16.202343+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:17.202467+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:18.202678+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:19.202839+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:20.203012+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:21.203168+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:22.203294+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:23.203479+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:24.203635+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:25.203796+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:26.204012+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:27.204201+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:28.204403+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:29.204612+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:30.204780+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:31.205157+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:32.205681+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:33.206078+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:34.206361+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:35.206502+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:36.207910+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:37.208060+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:38.208191+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:39.208513+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:40.208646+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:41.208812+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:42.208984+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:43.209388+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:44.209618+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:45.209787+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:46.209917+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:47.210259+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:48.210466+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:49.210637+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:50.210799+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:51.210982+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:52.211192+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:53.211378+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:54.211604+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:55.211786+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:56.212188+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:57.212327+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:58.212486+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:22:59.212662+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:00.212818+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:01.212994+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:02.213267+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:03.213421+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:04.213623+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:05.213799+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:06.214003+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:07.214165+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:08.214325+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:09.214664+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:10.214855+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:11.215066+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:12.215228+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:13.215372+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:14.215623+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:15.215778+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:16.215943+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:17.216272+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:18.216446+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:19.216685+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:20.216916+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:21.217088+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:22.217299+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:23.217545+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:24.217756+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:25.217937+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:26.218100+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:27.218302+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:28.218430+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:29.218559+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:30.218744+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:31.218888+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:32.219080+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:33.219264+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:34.219498+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:35.219663+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:36.219971+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:37.220147+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:38.220301+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:39.220505+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:40.220748+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:41.220922+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:42.221104+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:43.221270+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:44.221455+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67502080 unmapped: 647168 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:45.221616+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc ms_handle_reset ms_handle_reset con 0x56144a546000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1553116858
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1553116858,v1:192.168.122.100:6801/1553116858]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: get_auth_request con 0x561449e0a800 auth_method 0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc handle_mgr_configure stats_period=5
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:46.221767+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:47.221909+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:48.222044+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:49.222255+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:50.222496+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67829760 unmapped: 319488 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 ms_handle_reset con 0x561449e0b400 session 0x56144a79a700
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e32c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:51.222665+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:52.222836+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:53.223041+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:54.223288+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:55.223473+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:56.223629+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:57.223754+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:58.223915+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:23:59.224091+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:00.224242+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:01.224397+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:02.224553+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:03.224716+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:04.224884+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:05.225021+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:06.225368+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:07.225512+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:08.225731+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:09.226147+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:10.226314+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:11.226483+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:12.226695+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:13.226984+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:14.227178+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:15.227429+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:16.227565+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:17.227737+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:18.227868+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:19.227983+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:20.228195+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:21.228379+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:22.228557+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:23.228982+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:24.229216+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:25.229435+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:26.229598+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:27.229699+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:28.229864+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:29.229994+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:30.230160+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:31.230340+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:32.230517+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:33.230752+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:34.230945+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:35.231120+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:36.231271+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:37.231502+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:38.231643+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:39.231855+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:40.232283+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:41.232528+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:42.232856+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:43.233076+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:44.233415+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:45.233618+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:46.233834+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:47.234049+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:48.234393+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:49.234616+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:50.235060+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:51.235271+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:52.235527+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:53.235875+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:54.236116+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:55.236255+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:56.236381+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:57.236494+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:58.236646+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:24:59.236782+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:00.236913+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:01.237101+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:02.237282+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:03.237450+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:04.237762+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:05.237915+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:06.238131+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:07.238319+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:08.238533+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:09.238692+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:10.238833+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:11.238988+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:12.239130+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:13.239280+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:14.239517+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:15.239692+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:16.239853+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:17.240016+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:18.240192+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:19.240336+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:20.240489+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:21.240677+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:22.240833+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:23.240987+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:24.241236+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:25.241453+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:26.241677+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:27.241822+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:28.241954+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:29.242126+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:30.242339+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:31.242509+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:32.242736+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:33.242871+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:34.243074+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:35.243240+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:36.243445+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:37.243631+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:38.243778+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:39.243952+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:40.244191+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:41.244394+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:42.244620+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:43.244807+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:44.244986+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:45.245192+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:46.245345+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:47.245608+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:48.245833+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:49.246130+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:50.246396+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:51.246687+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:52.246975+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:53.247236+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:54.247414+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:55.247541+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:56.247675+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:57.247807+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:58.247989+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:25:59.248112+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:00.248256+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:01.248439+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:02.248665+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:03.248789+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:04.248936+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:05.249137+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:06.249312+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:07.249481+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:08.249636+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:09.249809+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:10.250007+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:11.250142+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:12.250292+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:13.250478+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:14.250693+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:15.250880+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:16.251049+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:17.251190+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:18.251439+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:19.251599+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:20.251726+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:21.251854+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:22.252020+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:23.252197+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:24.252359+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:25.252489+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:26.252620+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:27.252772+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:28.252923+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:29.253064+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:30.253258+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:31.253417+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:32.253625+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:33.253781+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:34.254028+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:35.254195+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:36.254369+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:37.254632+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:38.254828+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:39.255056+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:40.255219+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:41.255346+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:42.257663+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:43.257832+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:44.258001+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:45.258230+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:46.258446+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:47.258628+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:48.258766+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:49.258955+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:50.259120+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:51.259345+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:52.259546+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:53.259791+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:54.259998+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:55.260188+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:56.260386+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:57.260606+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:58.260844+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:26:59.261033+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:00.261236+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:01.261398+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:02.261549+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:03.261745+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:04.261963+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:05.262187+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:06.262349+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:07.262560+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:08.262804+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:09.262988+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:10.263164+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:11.263420+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:12.263670+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:13.263917+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:14.264120+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:15.264257+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:16.264407+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:17.264556+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:18.264733+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:19.264958+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:20.265160+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:21.265391+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:22.265675+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:23.265946+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:24.271630+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:25.271915+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:26.272135+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:27.272330+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:28.272545+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:29.272711+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:30.272921+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:31.273257+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:32.273496+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:33.273677+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:34.273930+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:35.274110+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:36.274296+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:37.274557+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:38.274788+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:39.275015+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:40.275248+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:41.275516+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:42.275685+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:43.275817+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:44.276079+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:45.276307+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:46.276461+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:47.276603+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:48.276786+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:49.276974+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:50.277119+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:51.277336+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:52.277531+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:53.277669+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:54.277836+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:55.278012+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:56.278163+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:57.278316+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:58.278533+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:27:59.278754+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:00.278954+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:01.279121+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:02.279245+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:03.279424+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:04.279645+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:05.279857+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:06.280093+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:07.280329+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:08.280547+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:09.280713+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:10.280867+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:11.281131+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67813376 unmapped: 335872 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:12.281310+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:13.281488+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:14.281714+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:15.281901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:16.282844+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:17.283285+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:18.283516+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:19.283678+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:20.283882+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:21.284050+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:22.284237+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:23.284430+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:24.284725+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:25.284899+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:26.285062+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:27.285214+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:28.285365+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:29.285550+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:30.285761+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:31.285936+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:32.286181+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:33.286345+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:34.286553+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:35.286740+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:36.286896+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:37.287087+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:38.287255+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67821568 unmapped: 327680 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:39.287452+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 4373 writes, 451 syncs, 9.70 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.02              0.00         1    0.017       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.5      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c39a30#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x561448c398d0#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:40.287681+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:41.287860+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:42.288064+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:43.288204+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:44.288415+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:45.288546+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:46.288765+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:47.288912+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:48.289075+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:49.289237+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:50.289402+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67854336 unmapped: 294912 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:51.289614+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:52.289792+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:53.290010+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:54.290241+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:55.290450+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:56.291079+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:57.291411+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:58.291544+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:28:59.291851+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:00.292274+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:01.292724+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:02.293040+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:03.293544+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:04.294028+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:05.294272+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:06.294455+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:07.294789+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:08.295095+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:09.295411+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:10.295698+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:11.295983+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:12.296251+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:13.296506+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:14.296872+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:15.297088+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:16.297317+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:17.297532+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:18.297759+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:19.297937+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:20.298127+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:21.298300+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:22.298500+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:23.298656+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:24.298916+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:25.299183+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:26.299374+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:27.299561+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:28.299804+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:29.300274+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:30.300635+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:31.300927+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:32.301043+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:33.301180+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:34.301353+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:35.301655+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:36.301805+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:37.302043+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:38.302219+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:39.302504+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:40.302707+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:41.303011+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:42.303268+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:43.303400+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:44.303675+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:45.303927+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:46.304284+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:47.304554+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:48.304760+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:49.304990+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:50.305238+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:51.305418+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:52.305634+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:53.305845+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:54.306095+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:55.306311+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:56.306536+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:57.306782+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:58.307132+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:29:59.307391+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:00.307643+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:01.307872+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:02.308044+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:03.308185+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:04.308519+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:05.308762+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:06.308956+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:07.309393+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:08.309853+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:09.310265+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:10.310531+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:11.310835+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:12.311159+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:13.311435+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:14.311718+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:15.311957+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:16.312199+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:17.312421+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:18.312645+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67862528 unmapped: 286720 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:19.312789+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 512979 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 heartbeat osd_stat(store_statfs(0x4fe120000/0x0/0x4ffc00000, data 0x47827/0xac000, compress 0x0/0x0/0x0, omap 0xcf2a, meta 0x1a230d6), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 handle_osd_map epochs [63,63], i have 62, src has [1,63]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 62 handle_osd_map epochs [63,63], i have 63, src has [1,63]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 1082.728759766s of 1082.736083984s, submitted: 4
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:20.312958+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67993600 unmapped: 155648 heap: 68149248 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 63 handle_osd_map epochs [63,64], i have 63, src has [1,64]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:21.313090+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 9568256 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 64 ms_handle_reset con 0x561449e33000 session 0x56144ca58fc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:22.313289+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67895296 unmapped: 9568256 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 64 heartbeat osd_stat(store_statfs(0x4fdca8000/0x0/0x4ffc00000, data 0x4ba415/0x522000, compress 0x0/0x0/0x0, omap 0xd454, meta 0x1a22bac), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 64 handle_osd_map epochs [65,65], i have 64, src has [1,65]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 64 handle_osd_map epochs [65,65], i have 65, src has [1,65]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:23.313462+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 9551872 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:24.313665+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 67911680 unmapped: 9551872 heap: 77463552 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc69000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 65 ms_handle_reset con 0x56144cc69000 session 0x56144a2c8540
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 586543 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:25.313858+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 65 heartbeat osd_stat(store_statfs(0x4fd4a7000/0x0/0x4ffc00000, data 0xcbb9fe/0xd25000, compress 0x0/0x0/0x0, omap 0xd6ef, meta 0x1a22911), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 65 handle_osd_map epochs [66,66], i have 65, src has [1,66]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 65 handle_osd_map epochs [66,66], i have 66, src has [1,66]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:26.314017+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:27.314212+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:28.314400+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:29.314640+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 590035 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:30.314838+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 66 heartbeat osd_stat(store_statfs(0x4fd4a2000/0x0/0x4ffc00000, data 0xcbcfe7/0xd28000, compress 0x0/0x0/0x0, omap 0xd98e, meta 0x1a22672), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:31.315080+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:32.315326+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:33.315685+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:34.315992+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 590035 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:35.316211+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68173824 unmapped: 17686528 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 66 handle_osd_map epochs [67,67], i have 66, src has [1,67]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.200830460s of 15.669299126s, submitted: 15
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:36.316419+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:37.316623+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:38.316773+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:39.316917+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:40.317075+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:41.317248+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:42.317442+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:43.317628+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:44.317904+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:45.318134+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:46.318435+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:47.318638+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:48.318951+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:49.319205+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:50.319407+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:51.319697+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:52.319872+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:53.320176+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:54.320384+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:55.320611+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:56.320796+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:57.320950+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:58.321164+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:30:59.321399+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68190208 unmapped: 17670144 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 592807 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:00.321712+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 17661952 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:01.322028+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68198400 unmapped: 17661952 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:02.322143+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68329472 unmapped: 17530880 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 67 handle_osd_map epochs [68,68], i have 67, src has [1,68]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.018529892s of 27.025806427s, submitted: 9
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 68 ms_handle_reset con 0x56144cc68c00 session 0x56144a2c8c40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:03.322304+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 68 heartbeat osd_stat(store_statfs(0x4fd49f000/0x0/0x4ffc00000, data 0xcbe497/0xd2b000, compress 0x0/0x0/0x0, omap 0xd9de, meta 0x1a22622), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 17506304 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:04.322517+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68354048 unmapped: 17506304 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 597319 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:05.322658+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68493312 unmapped: 17367040 heap: 85860352 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 68 heartbeat osd_stat(store_statfs(0x4fd49a000/0x0/0x4ffc00000, data 0xcbfa97/0xd30000, compress 0x0/0x0/0x0, omap 0xdab4, meta 0x1a2254c), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:06.322790+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 77086720 unmapped: 17170432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:07.322924+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 25452544 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 68 handle_osd_map epochs [68,69], i have 68, src has [1,69]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 69 ms_handle_reset con 0x56144cc66000 session 0x56144b4061c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:08.323088+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68804608 unmapped: 25452544 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 69 handle_osd_map epochs [69,70], i have 69, src has [1,70]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 70 ms_handle_reset con 0x56144cc66400 session 0x56144a30efc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:09.323269+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68763648 unmapped: 25493504 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 70 handle_osd_map epochs [71,71], i have 70, src has [1,71]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x561449e33000 session 0x56144cab6000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x56144cc66000 session 0x56144ca01a40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:10.323528+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 617753 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 68845568 unmapped: 25411584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x56144cc68c00 session 0x56144c8aefc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 71 ms_handle_reset con 0x56144cc66800 session 0x56144c82c000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fd490000/0x0/0x4ffc00000, data 0xcc4073/0xd38000, compress 0x0/0x0/0x0, omap 0xd42d, meta 0x1a22bd3), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:11.323713+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 25255936 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144c86ec00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:12.323906+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 71 heartbeat osd_stat(store_statfs(0x4fd490000/0x0/0x4ffc00000, data 0xcc4073/0xd38000, compress 0x0/0x0/0x0, omap 0xd42d, meta 0x1a22bd3), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 25108480 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 71 handle_osd_map epochs [71,72], i have 71, src has [1,72]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.791918755s of 10.061242104s, submitted: 64
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 72 ms_handle_reset con 0x56144c86ec00 session 0x56144a30fa40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:13.324101+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 72 ms_handle_reset con 0x561449e33000 session 0x56144a2c8000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 72 ms_handle_reset con 0x56144cc66800 session 0x56144b55c540
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 72 heartbeat osd_stat(store_statfs(0x4fd493000/0x0/0x4ffc00000, data 0xcc4096/0xd39000, compress 0x0/0x0/0x0, omap 0xd42d, meta 0x1a22bd3), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69017600 unmapped: 25239552 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:14.324298+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 25231360 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 72 handle_osd_map epochs [73,73], i have 72, src has [1,73]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 73 ms_handle_reset con 0x56144cc68c00 session 0x56144ca008c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:15.324427+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 623846 data_alloc: 218103808 data_used: 658
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf5400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf5000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69378048 unmapped: 24879104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 73 handle_osd_map epochs [74,74], i have 73, src has [1,74]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 74 ms_handle_reset con 0x56144dcf5000 session 0x56144a79b340
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 74 ms_handle_reset con 0x56144dcf5400 session 0x56144c82c8c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:16.324632+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 74 handle_osd_map epochs [75,75], i have 74, src has [1,75]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 24764416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 75 heartbeat osd_stat(store_statfs(0x4fd482000/0x0/0x4ffc00000, data 0xcc939a/0xd43000, compress 0x0/0x0/0x0, omap 0xce07, meta 0x1a231f9), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:17.324861+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 24764416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:18.325079+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 24731648 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:19.325267+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 24731648 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:20.325454+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 630921 data_alloc: 218103808 data_used: 4719
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69533696 unmapped: 24723456 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 75 handle_osd_map epochs [75,76], i have 75, src has [1,76]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 76 ms_handle_reset con 0x561449e33000 session 0x56144c82d880
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 76 heartbeat osd_stat(store_statfs(0x4fd488000/0x0/0x4ffc00000, data 0xcc93aa/0xd44000, compress 0x0/0x0/0x0, omap 0xcfee, meta 0x1a23012), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:21.325632+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 24657920 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:22.325803+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 76 heartbeat osd_stat(store_statfs(0x4fd482000/0x0/0x4ffc00000, data 0xcca9b6/0xd48000, compress 0x0/0x0/0x0, omap 0xd1d8, meta 0x1a22e28), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 24657920 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 76 handle_osd_map epochs [77,77], i have 76, src has [1,77]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.904905319s of 10.081530571s, submitted: 99
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 77 ms_handle_reset con 0x56144b455c00 session 0x56144b407500
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:23.325996+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 24625152 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 77 handle_osd_map epochs [78,78], i have 77, src has [1,78]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:24.326215+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 24625152 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 78 handle_osd_map epochs [78,79], i have 78, src has [1,79]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 79 ms_handle_reset con 0x56144b455800 session 0x56144a8228c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:25.326341+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 651584 data_alloc: 218103808 data_used: 4719
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69697536 unmapped: 24559616 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 79 handle_osd_map epochs [79,80], i have 79, src has [1,80]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 80 ms_handle_reset con 0x56144b455400 session 0x56144a823c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 80 handle_osd_map epochs [80,81], i have 80, src has [1,81]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:26.326471+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69844992 unmapped: 24412160 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 81 handle_osd_map epochs [81,82], i have 81, src has [1,82]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 82 ms_handle_reset con 0x56144cc66000 session 0x56144a79b500
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:27.326643+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 24387584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:28.326785+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 82 heartbeat osd_stat(store_statfs(0x4fd46a000/0x0/0x4ffc00000, data 0xcd30b3/0xd60000, compress 0x0/0x0/0x0, omap 0x11814, meta 0x1a1e7ec), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 24387584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:29.326952+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 24387584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:30.327118+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 665045 data_alloc: 218103808 data_used: 4719
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 69861376 unmapped: 24395776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 82 handle_osd_map epochs [83,83], i have 82, src has [1,83]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 83 ms_handle_reset con 0x561449e33000 session 0x56144c82ca80
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:31.327252+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 23314432 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 83 handle_osd_map epochs [83,84], i have 83, src has [1,84]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 84 ms_handle_reset con 0x56144b455400 session 0x56144a5e4700
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:32.327392+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 23232512 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 84 handle_osd_map epochs [85,85], i have 84, src has [1,85]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.928540230s of 10.082557678s, submitted: 92
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 85 ms_handle_reset con 0x56144b455800 session 0x56144a5e4a80
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:33.327556+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 23085056 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 85 heartbeat osd_stat(store_statfs(0x4fd466000/0x0/0x4ffc00000, data 0xcd6e5d/0xd64000, compress 0x0/0x0/0x0, omap 0x10e07, meta 0x1a1f1f9), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:34.327822+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 86 ms_handle_reset con 0x56144b455c00 session 0x56144c8aee00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 23027712 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 86 handle_osd_map epochs [86,87], i have 86, src has [1,87]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:35.327949+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 87 ms_handle_reset con 0x56144cc68c00 session 0x56144c82c380
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 675307 data_alloc: 218103808 data_used: 8780
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 23044096 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 88 ms_handle_reset con 0x561449e33000 session 0x56144a5e5dc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:36.328051+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:37.328237+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:38.328446+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 88 heartbeat osd_stat(store_statfs(0x4fc2c0000/0x0/0x4ffc00000, data 0xcdb06a/0xd6a000, compress 0x0/0x0/0x0, omap 0x10ad6, meta 0x2bbf52a), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:39.328692+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:40.328897+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 675328 data_alloc: 218103808 data_used: 8780
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:41.329087+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 22863872 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:42.329310+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 22839296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:43.329442+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 22839296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:44.329686+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 89 heartbeat osd_stat(store_statfs(0x4fc2bd000/0x0/0x4ffc00000, data 0xcdc536/0xd6d000, compress 0x0/0x0/0x0, omap 0x10c0d, meta 0x2bbf3f3), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 22839296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:45.329880+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.114182472s of 12.301213264s, submitted: 110
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 679237 data_alloc: 218103808 data_used: 8780
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 22765568 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 89 handle_osd_map epochs [89,90], i have 89, src has [1,90]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 90 ms_handle_reset con 0x56144b455c00 session 0x56144ca01880
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:46.330034+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:47.330234+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:48.330370+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 90 heartbeat osd_stat(store_statfs(0x4fc2b9000/0x0/0x4ffc00000, data 0xcddb5e/0xd71000, compress 0x0/0x0/0x0, omap 0x10d90, meta 0x2bbf270), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:49.330536+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:50.330660+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 682673 data_alloc: 218103808 data_used: 8780
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 22749184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:51.330815+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 22732800 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:52.330926+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 92 ms_handle_reset con 0x56144cc66800 session 0x56144a5e5500
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf5000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 92 ms_handle_reset con 0x56144dcf5000 session 0x56144a823c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 22716416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:53.331107+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 22716416 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:54.331288+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 92 heartbeat osd_stat(store_statfs(0x4fc2b0000/0x0/0x4ffc00000, data 0xce075c/0xd78000, compress 0x0/0x0/0x0, omap 0x10f91, meta 0x2bbf06f), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:55.331465+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690209 data_alloc: 218103808 data_used: 8780
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:56.331640+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:57.331810+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:58.332039+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:31:59.332220+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 92 heartbeat osd_stat(store_statfs(0x4fc2b0000/0x0/0x4ffc00000, data 0xce075c/0xd78000, compress 0x0/0x0/0x0, omap 0x10f91, meta 0x2bbf06f), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 22691840 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:00.332404+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 690209 data_alloc: 218103808 data_used: 8780
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 92 heartbeat osd_stat(store_statfs(0x4fc2b0000/0x0/0x4ffc00000, data 0xce075c/0xd78000, compress 0x0/0x0/0x0, omap 0x10f91, meta 0x2bbf06f), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf4c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 92 ms_handle_reset con 0x56144dcf4c00 session 0x56144a863340
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 22953984 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:01.332622+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.329170227s of 16.372501373s, submitted: 33
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 93 ms_handle_reset con 0x561449e33000 session 0x56144a822380
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 22953984 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:02.332848+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 93 ms_handle_reset con 0x56144cc72c00 session 0x56144a8228c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 22953984 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 93 handle_osd_map epochs [93,94], i have 93, src has [1,94]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:03.333041+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 23142400 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 95 ms_handle_reset con 0x56144cbc4400 session 0x56144a823a40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:04.333240+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449a3b400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144c86ec00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 21807104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:05.333483+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 701785 data_alloc: 218103808 data_used: 8796
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 21807104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:06.333646+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 95 heartbeat osd_stat(store_statfs(0x4fc2a6000/0x0/0x4ffc00000, data 0xce480a/0xd82000, compress 0x0/0x0/0x0, omap 0x11421, meta 0x2bbebdf), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72450048 unmapped: 21807104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:07.333856+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72622080 unmapped: 21635072 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 96 ms_handle_reset con 0x56144cc79000 session 0x56144ca01500
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:08.334060+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144a818800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72777728 unmapped: 21479424 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 96 heartbeat osd_stat(store_statfs(0x4fc2a3000/0x0/0x4ffc00000, data 0xce5e53/0xd87000, compress 0x0/0x0/0x0, omap 0x114cc, meta 0x2bbeb34), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x56144a818800 session 0x56144b406380
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:09.335330+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 97 heartbeat osd_stat(store_statfs(0x4fc2a0000/0x0/0x4ffc00000, data 0xce743c/0xd8a000, compress 0x0/0x0/0x0, omap 0x11577, meta 0x2bbea89), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72794112 unmapped: 21463040 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x561449e33000 session 0x56144ca65a40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:10.335687+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 710022 data_alloc: 218103808 data_used: 8831
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x56144cbc4400 session 0x56144ca4ddc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72908800 unmapped: 21348352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 97 ms_handle_reset con 0x56144cc72c00 session 0x56144b691dc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:11.336370+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.220746994s of 10.319797516s, submitted: 71
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 72941568 unmapped: 21315584 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:12.336657+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 99 ms_handle_reset con 0x56144cc79000 session 0x56144a823180
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc69800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 74219520 unmapped: 20037632 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 99 heartbeat osd_stat(store_statfs(0x4fc29c000/0x0/0x4ffc00000, data 0xcea072/0xd8e000, compress 0x0/0x0/0x0, omap 0x112a8, meta 0x2bbed58), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc69800 session 0x56144a823dc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:13.336811+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 20299776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:14.337160+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 20299776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:15.337299+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 721044 data_alloc: 218103808 data_used: 8831
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73957376 unmapped: 20299776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:16.337664+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x561449e33000 session 0x56144ca00e00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cbc4400 session 0x56144a5e48c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc72c00 session 0x56144a79b500
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 73973760 unmapped: 20283392 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 heartbeat osd_stat(store_statfs(0x4fc295000/0x0/0x4ffc00000, data 0xceba87/0xd93000, compress 0x0/0x0/0x0, omap 0x113fe, meta 0x2bbec02), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc79000 session 0x56144a862540
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:17.337824+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc69000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc69000 session 0x56144a5e5880
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x561449e33000 session 0x56144a822540
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cbc4400 session 0x56144b4061c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc72c00 session 0x56144a823340
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc79000 session 0x56144b407c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75317248 unmapped: 18939904 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc68c00 session 0x56144ca4cfc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:18.338159+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cc68c00 session 0x56144ca4ce00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75448320 unmapped: 18808832 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x561449e33000 session 0x56144b6901c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:19.338483+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 ms_handle_reset con 0x56144cbc4400 session 0x56144b691c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc72c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc79000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75505664 unmapped: 18751488 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:20.338750+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 724938 data_alloc: 218103808 data_used: 9359
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75513856 unmapped: 18743296 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:21.338977+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 heartbeat osd_stat(store_statfs(0x4fc298000/0x0/0x4ffc00000, data 0xceba97/0xd94000, compress 0x0/0x0/0x0, omap 0x117c7, meta 0x2bbe839), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75522048 unmapped: 18735104 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:22.339237+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 101 ms_handle_reset con 0x56144b455400 session 0x56144ca01a40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75849728 unmapped: 18407424 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.013634682s of 11.103779793s, submitted: 68
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:23.339486+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x56144b455800 session 0x56144b407dc0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x56144b455400 session 0x56144a2c8000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x561449e33000 session 0x56144b407180
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 102 ms_handle_reset con 0x56144cbc4400 session 0x56144ca71500
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc68c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75816960 unmapped: 18440192 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:24.339752+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 103 ms_handle_reset con 0x56144cc68c00 session 0x56144a823880
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75980800 unmapped: 18276352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:25.339986+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 104 ms_handle_reset con 0x56144b455c00 session 0x56144ca4cc40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 740651 data_alloc: 218103808 data_used: 9871
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75980800 unmapped: 18276352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:26.340181+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75980800 unmapped: 18276352 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 104 heartbeat osd_stat(store_statfs(0x4fc286000/0x0/0x4ffc00000, data 0xcf1598/0xda2000, compress 0x0/0x0/0x0, omap 0x12132, meta 0x2bbdece), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:27.340343+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449e33000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 104 ms_handle_reset con 0x561449e33000 session 0x56144ca59c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144b455400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76005376 unmapped: 18251776 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:28.340509+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 105 ms_handle_reset con 0x56144b455400 session 0x56144b407a40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 105 heartbeat osd_stat(store_statfs(0x4fc28b000/0x0/0x4ffc00000, data 0xcf1588/0xda1000, compress 0x0/0x0/0x0, omap 0x11fdc, meta 0x2bbe024), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76054528 unmapped: 18202624 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:29.340671+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 105 heartbeat osd_stat(store_statfs(0x4fc287000/0x0/0x4ffc00000, data 0xcf2783/0xda2000, compress 0x0/0x0/0x0, omap 0x11ddb, meta 0x2bbe225), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 105 ms_handle_reset con 0x56144cc72c00 session 0x56144ca8ce00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 105 ms_handle_reset con 0x56144cc79000 session 0x56144ca00c40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76054528 unmapped: 18202624 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:30.340823+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 105 handle_osd_map epochs [105,106], i have 105, src has [1,106]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 742408 data_alloc: 218103808 data_used: 14470
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 106 ms_handle_reset con 0x56144cbc4400 session 0x56144c8afa40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 106 handle_osd_map epochs [106,107], i have 106, src has [1,107]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:31.340982+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:32.341120+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 heartbeat osd_stat(store_statfs(0x4fc281000/0x0/0x4ffc00000, data 0xcf527c/0xda7000, compress 0x0/0x0/0x0, omap 0x126e7, meta 0x2bbd919), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:33.341391+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x561449a3b400 session 0x56144a5e4700
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x56144c86ec00 session 0x56144a822c40
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cbc4400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:34.341594+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.015930176s of 11.220238686s, submitted: 138
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x56144cbc4400 session 0x56144c82d6c0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 heartbeat osd_stat(store_statfs(0x4fc281000/0x0/0x4ffc00000, data 0xcf527c/0xda7000, compress 0x0/0x0/0x0, omap 0x126e7, meta 0x2bbd919), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75685888 unmapped: 18571264 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144cc66800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:35.341709+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 ms_handle_reset con 0x56144cc66800 session 0x56144b691880
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144dcf4c00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 heartbeat osd_stat(store_statfs(0x4fc286000/0x0/0x4ffc00000, data 0xcf526c/0xda6000, compress 0x0/0x0/0x0, omap 0x12773, meta 0x2bbd88d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 743999 data_alloc: 218103808 data_used: 14454
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _renew_subs
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 108 ms_handle_reset con 0x56144dcf4c00 session 0x56144ca4d340
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 18546688 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 108 heartbeat osd_stat(store_statfs(0x4fc286000/0x0/0x4ffc00000, data 0xcf526c/0xda6000, compress 0x0/0x0/0x0, omap 0x12773, meta 0x2bbd88d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:36.341830+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75710464 unmapped: 18546688 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:37.342071+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 108 heartbeat osd_stat(store_statfs(0x4fc284000/0x0/0x4ffc00000, data 0xcf68c5/0xda8000, compress 0x0/0x0/0x0, omap 0x12a4c, meta 0x2bbd5b4), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:38.342220+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:39.342398+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:40.342554+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 746645 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:41.342773+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:42.343031+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:43.343333+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 109 heartbeat osd_stat(store_statfs(0x4fc27f000/0x0/0x4ffc00000, data 0xcf7d91/0xdab000, compress 0x0/0x0/0x0, omap 0x12ab8, meta 0x2bbd548), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:44.343632+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:45.343849+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 109 heartbeat osd_stat(store_statfs(0x4fc27f000/0x0/0x4ffc00000, data 0xcf7d91/0xdab000, compress 0x0/0x0/0x0, omap 0x12ab8, meta 0x2bbd548), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 750139 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:46.344034+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:47.344307+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:48.344559+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.987587929s of 14.052734375s, submitted: 37
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:49.344811+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75718656 unmapped: 18538496 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:50.345076+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:51.345279+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:52.345456+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:53.345694+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:54.345861+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:55.345985+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:56.346219+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:57.346435+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:58.346609+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:32:59.346808+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:00.346928+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:01.347120+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:02.347318+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:03.347711+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:04.348267+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:05.348450+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:06.348674+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:07.348862+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:08.349084+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:09.349311+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:10.349529+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:11.349719+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:12.349905+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:13.350106+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:14.350399+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:15.350598+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:16.350779+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:17.350906+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:18.351025+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:19.351143+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:20.351290+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:21.351481+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:22.351653+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:23.351813+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:24.352019+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:25.352212+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:26.352360+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:27.352634+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:28.352802+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:29.352954+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:30.353119+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:31.353297+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:32.353463+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:33.353688+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:34.353914+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:35.354061+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:36.354210+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:37.354357+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:38.354541+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:39.354682+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:40.354944+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:41.355128+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:42.355352+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:43.355534+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:44.355772+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:45.355937+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:46.356096+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:47.356299+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:48.356511+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:49.356671+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:50.356823+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread fragmentation_score=0.000120 took=0.000018s
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75595776 unmapped: 18661376 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:51.356998+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:52.357165+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:53.357340+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:54.357590+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:55.357729+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:56.357901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:57.358085+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:58.358215+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:33:59.358375+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:00.358492+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:01.358667+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:02.358804+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:03.358952+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:04.359104+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:05.359223+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:06.359334+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:07.359468+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:08.359589+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:09.359716+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:10.359843+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:11.359977+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:12.360118+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75603968 unmapped: 18653184 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:13.360256+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 75726848 unmapped: 18530304 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:14.360418+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config show' '{prefix=config show}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 17915904 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:15.360542+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76374016 unmapped: 17883136 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:16.360688+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 17965056 heap: 94257152 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:17.360811+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'perf dump' '{prefix=perf dump}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 28852224 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:18.360934+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'perf schema' '{prefix=perf schema}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:19.361051+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:20.361175+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:21.361295+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:22.361418+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:23.361534+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:24.361688+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:25.361809+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:26.361918+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:27.362079+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:28.362260+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:29.362394+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:30.362518+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:31.362681+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:32.362822+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:33.363007+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:34.363184+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:35.363332+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:36.363471+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:37.363607+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:38.363775+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:39.363899+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:40.364099+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:41.364265+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:42.364439+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:43.364676+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:44.364856+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:45.365027+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:46.365224+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:47.365437+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:48.365620+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:49.365755+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:50.365918+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:51.366085+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:52.366233+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:53.366430+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:54.366619+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:55.366748+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:56.366946+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:57.367086+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:58.367212+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:34:59.367409+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:00.367650+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:01.367835+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:02.368061+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:03.368220+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:04.368441+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:05.368644+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:06.368856+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:07.369011+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:08.369200+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:09.369382+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:10.369628+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:11.369924+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:12.370242+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:13.370459+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:14.370758+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:15.370977+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:16.371270+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:17.371549+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:18.371821+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:19.372064+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:20.372323+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:21.372622+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:22.373902+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:23.374367+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:24.375079+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:25.375689+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:26.376688+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:27.376960+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:28.377216+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:29.377469+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:30.377677+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:31.377901+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:32.378090+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:33.378240+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:34.378413+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:35.378636+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:36.378788+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:37.378914+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:38.379062+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:39.379438+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:40.379670+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:41.379840+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:42.380007+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:43.380152+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:44.380494+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:45.380922+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:46.381076+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:47.381235+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:48.381355+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:49.381551+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:50.381760+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:51.381977+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:52.382146+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:53.382313+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:54.382763+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:55.383160+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:56.383630+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:57.383802+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:58.383939+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:35:59.384084+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:00.384270+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:01.384481+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:02.384652+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:03.384899+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:04.385108+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:05.385250+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:06.385450+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:07.385625+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:08.385784+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:09.385998+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:10.386200+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:11.386372+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:12.386592+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:13.386867+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:14.387795+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:15.387963+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76218368 unmapped: 29081600 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:16.388116+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:17.388265+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:18.388386+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:19.388610+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:20.388797+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:21.388947+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:22.389160+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:23.389324+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:24.389547+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:25.389739+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76226560 unmapped: 29073408 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:26.389905+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:27.390043+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:28.390205+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:29.390364+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:30.390511+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:31.390723+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:32.390891+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:33.391054+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:34.391318+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:35.391521+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:36.391692+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:37.391851+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76234752 unmapped: 29065216 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:38.391977+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:39.392135+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:40.392335+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:41.392523+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:42.392681+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:43.392825+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:44.393038+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:45.393199+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:46.393374+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76242944 unmapped: 29057024 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:47.393556+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 29048832 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:48.393791+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 29048832 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:49.393930+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 29048832 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:50.394089+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 29048832 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:51.394245+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:52.394416+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:53.394640+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:54.394804+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:55.394941+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:56.395081+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:57.395226+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:58.395392+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:36:59.395541+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:00.395703+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:01.395857+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76259328 unmapped: 29040640 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:02.396007+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:03.396199+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:04.396465+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:05.396679+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:06.396861+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:07.397068+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:08.397240+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:09.397369+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:10.397509+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:11.397659+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:12.397836+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:13.398029+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:14.398235+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:15.398448+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:16.398633+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:17.398835+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:18.398958+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:19.399100+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76267520 unmapped: 29032448 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:20.399249+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:21.399402+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:22.399614+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:23.399735+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:24.399911+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:25.400048+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:26.400193+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:27.400359+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:28.400510+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:29.400652+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:30.400813+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:31.401050+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:32.401201+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76275712 unmapped: 29024256 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:33.401380+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 29016064 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:34.401778+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 29016064 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:35.402927+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 29016064 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:36.403950+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 29016064 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:37.404419+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 29016064 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:38.405242+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76283904 unmapped: 29016064 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:39.405975+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:40.406693+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:41.407305+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:42.407906+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:43.408419+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:44.408892+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:45.409298+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:46.409451+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:47.409725+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:48.409937+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:49.410139+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:50.410322+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76292096 unmapped: 29007872 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:51.410540+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 28999680 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:52.410803+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 28999680 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:53.411090+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 28999680 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:54.411402+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76308480 unmapped: 28991488 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:55.411630+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76308480 unmapped: 28991488 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:56.411846+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76308480 unmapped: 28991488 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:57.412090+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76308480 unmapped: 28991488 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:58.412313+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76308480 unmapped: 28991488 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:37:59.412557+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76308480 unmapped: 28991488 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:00.412816+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:01.413021+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:02.413302+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:03.413539+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:04.413986+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:05.414221+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:06.414397+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:07.414659+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:08.414866+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:09.415100+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:10.415328+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:11.415535+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:12.415755+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:13.415969+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:14.416349+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:15.416623+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:16.416881+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:17.417085+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:18.417304+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:19.417487+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:20.417716+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:21.417933+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:22.418131+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:23.418293+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:24.418527+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:25.418726+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:26.418959+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:27.419253+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:28.419552+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:29.419812+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:30.419984+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:31.420264+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:32.420637+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:33.420908+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:34.421231+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:35.421618+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:36.421911+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:37.422193+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:38.422380+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:39.422685+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 5730 writes, 23K keys, 5730 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                           Cumulative WAL: 5730 writes, 1063 syncs, 5.39 writes per sync, written: 0.02 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1357 writes, 3840 keys, 1357 commit groups, 1.0 writes per commit group, ingest: 2.14 MB, 0.00 MB/s
                                           Interval WAL: 1357 writes, 612 syncs, 2.22 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:40.423001+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:41.423307+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:42.423717+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:43.424014+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:44.424263+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:45.424549+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc ms_handle_reset ms_handle_reset con 0x561449e0a800
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/1553116858
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/1553116858,v1:192.168.122.100:6801/1553116858]
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: get_auth_request con 0x56144dcf4c00 auth_method 0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: mgrc handle_mgr_configure stats_period=5
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:46.424854+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:47.425148+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:48.425457+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:49.425747+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:50.426036+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 ms_handle_reset con 0x561449e32c00 session 0x56144a79aa80
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x561449a3b400
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:51.426283+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:52.426489+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:53.426660+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:54.426855+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76455936 unmapped: 28844032 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:55.427041+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:56.427436+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:57.427653+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:58.427827+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:38:59.428019+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:00.428330+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:01.428482+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:02.428724+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:03.428899+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:04.429090+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:05.429314+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:06.429498+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:07.429680+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:08.429804+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:09.430019+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:10.430246+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:11.430410+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:12.430893+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:13.431074+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:14.431262+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:15.431453+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:16.431599+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:17.431706+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:18.431865+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:19.432014+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:20.432172+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:21.432367+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:22.432531+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:23.432752+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:24.432956+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:25.433135+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:26.433320+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:27.433550+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:28.433775+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 ms_handle_reset con 0x56144a75f800 session 0x56144b55c000
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: handle_auth_request added challenge on 0x56144c86ec00
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:29.434057+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:30.434238+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 28835840 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:31.434398+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:32.434653+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:33.434797+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:34.435390+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:35.435606+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:36.435800+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:37.435951+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:38.436271+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:39.436441+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:40.437233+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:41.437764+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:42.438197+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:43.439912+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:44.440519+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:45.440664+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:46.440908+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:47.441124+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:48.441428+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:49.441635+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:50.441768+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:51.442808+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:52.443034+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:53.443264+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:54.443452+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:55.443683+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:56.443859+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:57.443998+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:58.444180+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:39:59.444344+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:00.444511+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:01.444754+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:02.444908+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:03.445177+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:04.445472+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:05.445676+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:06.445795+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:07.445947+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:08.446102+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:09.446279+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:10.446437+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:11.446653+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:12.446824+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:13.447120+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:14.447333+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:15.447493+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:16.447707+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:17.447898+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:18.448141+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:19.448326+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:20.448537+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:21.448679+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:22.448859+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:23.448989+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:24.449249+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:25.449515+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:26.449693+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:27.449861+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:28.449984+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:29.450112+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:30.450311+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:31.450504+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:32.450715+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:33.450893+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:34.451090+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:35.451284+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:36.451473+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:37.451650+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:38.451812+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:39.451948+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:40.452097+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:41.452240+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:42.452418+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:43.452640+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:44.453345+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:45.453476+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:46.453649+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:47.453865+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:48.454015+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:49.454199+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:50.454383+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:51.454619+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:52.454786+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:53.454978+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:54.455287+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:55.455700+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:56.455882+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:57.456027+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76210176 unmapped: 29089792 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:58.456200+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76152832 unmapped: 29147136 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:40:59.456382+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76152832 unmapped: 29147136 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:00.456472+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:01.456634+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:02.456848+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:03.457019+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:04.457305+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:05.457508+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:06.457660+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:07.457873+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:08.458104+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76161024 unmapped: 29138944 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:09.458262+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:10.458425+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:11.458619+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:12.458808+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:13.459043+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:14.459282+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:15.459450+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:16.459664+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:17.460175+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:18.460411+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:19.460763+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:20.461087+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:21.461490+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:22.461791+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:23.462135+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:24.462364+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76169216 unmapped: 29130752 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:25.462684+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:26.462910+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:27.463170+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:28.463343+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76177408 unmapped: 29122560 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:29.463595+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:30.463734+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:31.463990+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:32.464238+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:33.464454+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:34.464702+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:35.464893+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76185600 unmapped: 29114368 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:36.465109+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:37.465327+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:38.465506+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:39.465724+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:40.466068+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:41.466339+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:42.466527+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:43.466760+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:44.467043+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:45.467195+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76193792 unmapped: 29106176 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:46.467366+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:47.467532+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:48.467754+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:49.467903+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:50.468641+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:51.468881+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:52.469089+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:53.469405+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:54.469638+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:55.469813+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:56.470023+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:57.470213+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:58.470477+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:41:59.470679+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:00.470991+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:01.471288+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:02.471468+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:03.471730+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:04.472000+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:05.472276+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:06.472543+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:07.499463+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:08.499729+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:09.499983+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:10.500266+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:11.500506+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:12.500676+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:13.500911+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:14.501138+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:15.501494+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:16.501702+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:17.501895+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:18.502146+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:19.502344+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:20.502634+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:21.503075+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:22.503348+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:23.503778+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:24.504137+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:25.504316+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:26.504558+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:27.504796+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:28.505065+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:29.505272+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:30.505612+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:31.505924+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:32.506209+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:33.506543+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:34.506900+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:35.507053+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:36.507235+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:37.507470+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:38.507689+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:39.507913+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:40.508083+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:41.508331+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:42.508532+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:43.508818+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:44.509099+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:45.509317+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:46.509480+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:47.509716+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:48.509885+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:49.510088+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:50.510252+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:51.510442+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:52.510692+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:53.510864+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:54.511062+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:55.511190+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:56.511421+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:57.511608+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:58.511762+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:42:59.511889+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:00.512084+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:01.512276+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:02.512439+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:03.512600+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:04.512774+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:05.512921+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:06.513046+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:07.513172+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:08.513353+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:09.513498+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:10.513626+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:11.513771+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:12.513891+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:13.514020+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:14.514158+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:15.514307+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:16.514457+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:17.514680+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:18.514850+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 03 21:43:53 compute-0 ceph-osd[86059]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 03 21:43:53 compute-0 ceph-osd[86059]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 752913 data_alloc: 218103808 data_used: 14419
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:19.515105+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76201984 unmapped: 29097984 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:20.515355+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config diff' '{prefix=config diff}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config show' '{prefix=config show}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76546048 unmapped: 28753920 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter dump' '{prefix=counter dump}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter schema' '{prefix=counter schema}'
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:21.515533+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 28893184 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: tick
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_tickets
Dec 03 21:43:53 compute-0 ceph-osd[86059]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-03T21:43:22.515611+0000)
Dec 03 21:43:53 compute-0 ceph-osd[86059]: osd.0 110 heartbeat osd_stat(store_statfs(0x4fc27c000/0x0/0x4ffc00000, data 0xcf9241/0xdae000, compress 0x0/0x0/0x0, omap 0x12b63, meta 0x2bbd49d), peers [1,2] op hist [])
Dec 03 21:43:53 compute-0 ceph-osd[86059]: prioritycache tune_memory target: 4294967296 mapped: 76472320 unmapped: 28827648 heap: 105299968 old mem: 2845415832 new mem: 2845415832
Dec 03 21:43:53 compute-0 ceph-osd[86059]: do_command 'log dump' '{prefix=log dump}'
Dec 03 21:43:54 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15148 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:54 compute-0 ceph-mon[75204]: pgmap v1166: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:54 compute-0 ceph-mon[75204]: from='client.15136 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:54 compute-0 ceph-mon[75204]: from='client.15138 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:54 compute-0 ceph-mon[75204]: from='client.15140 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:54 compute-0 ceph-mon[75204]: from='client.15142 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 03 21:43:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/982875438' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 03 21:43:54 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15152 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:54 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1167: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:54 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15156 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:54 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "versions"} v 0)
Dec 03 21:43:54 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3628276245' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 03 21:43:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:43:55 compute-0 ceph-mon[75204]: from='client.15144 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:55 compute-0 ceph-mon[75204]: from='client.15148 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:55 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/982875438' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Dec 03 21:43:55 compute-0 ceph-mon[75204]: from='client.15152 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:55 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3628276245' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Dec 03 21:43:55 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15158 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:55 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 03 21:43:55 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2329839099' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 03 21:43:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2788423869' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 03 21:43:56 compute-0 podman[265597]: 2025-12-03 21:43:56.166342224 +0000 UTC m=+0.104546126 container health_status ca3c9ffcca83e311ddd4b17245064c8421e62312b554af833cb488679e04a66c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 03 21:43:56 compute-0 podman[265595]: 2025-12-03 21:43:56.173014844 +0000 UTC m=+0.110985180 container health_status 2dbb897091576d016ef1909f1fe5c92482cf70641de50d59fe0f3cebe24c4b4c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 03 21:43:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 03 21:43:56 compute-0 ceph-mon[75204]: pgmap v1167: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:56 compute-0 ceph-mon[75204]: from='client.15156 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: from='client.15158 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2329839099' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2788423869' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 03 21:43:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 03 21:43:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 03 21:43:56 compute-0 systemd[1]: Starting Hostname Service...
Dec 03 21:43:56 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1168: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:56 compute-0 systemd[1]: Started Hostname Service.
Dec 03 21:43:56 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 03 21:43:56 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2248671052' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 03 21:43:57 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15174 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:57 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 03 21:43:57 compute-0 ceph-mon[75204]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 03 21:43:57 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/2248671052' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Dec 03 21:43:57 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 03 21:43:57 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/713723560' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 03 21:43:58 compute-0 ceph-mon[75204]: pgmap v1168: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:58 compute-0 ceph-mon[75204]: from='client.15174 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:43:58 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/713723560' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Dec 03 21:43:58 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "df"} v 0)
Dec 03 21:43:58 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1705632929' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 03 21:43:58 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1169: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:43:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 03 21:43:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1129719214' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 03 21:43:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1705632929' entity='client.admin' cmd={"prefix": "df"} : dispatch
Dec 03 21:43:59 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1129719214' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Dec 03 21:43:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 03 21:43:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3564759909' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 03 21:43:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 03 21:43:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3565795356' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:43:59 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 03 21:43:59 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3565795356' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:44:00 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15184 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:44:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 03 21:44:00 compute-0 ceph-mon[75204]: pgmap v1169: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:44:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3564759909' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Dec 03 21:44:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3565795356' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 03 21:44:00 compute-0 ceph-mon[75204]: from='client.? 192.168.122.10:0/3565795356' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 03 21:44:00 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1170: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:44:00 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 03 21:44:00 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3357124873' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 03 21:44:01 compute-0 ceph-mon[75204]: from='client.15184 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:44:01 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3357124873' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Dec 03 21:44:01 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 03 21:44:01 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3493588679' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 03 21:44:01 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15194 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 03 21:44:02 compute-0 ceph-mon[75204]: mon.compute-0@0(leader) e1 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec 03 21:44:02 compute-0 ceph-mon[75204]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1735159671' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 03 21:44:02 compute-0 ceph-mon[75204]: pgmap v1170: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:44:02 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/3493588679' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Dec 03 21:44:02 compute-0 ceph-mon[75204]: from='client.? 192.168.122.100:0/1735159671' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Dec 03 21:44:02 compute-0 ceph-mgr[75500]: log_channel(cluster) log [DBG] : pgmap v1171: 177 pgs: 177 active+clean; 41 MiB data, 177 MiB used, 60 GiB / 60 GiB avail
Dec 03 21:44:02 compute-0 ceph-mgr[75500]: log_channel(audit) log [DBG] : from='client.15198 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
